
Update self serve replication SQL to accept daily granularity as interval #234

Conversation

@chenselena (Collaborator) commented Oct 21, 2024

Summary

This PR adds support for daily granularity as a valid input to the interval parameter in the SQL API for self-serve replication.

Now the following SQL is valid and will not throw an exception:

ALTER TABLE db.testTable SET POLICY (REPLICATION=({destination:'a', interval:1D}))

where the interval parameter accepts daily and hourly inputs.
Validation of the 'D' and 'H' values will continue to be performed server-side, accepting 12H and 1D/2D/3D inputs. The PR for that can be found here.
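The server-side check described above can be sketched as a simple pattern match. This is a minimal, illustrative sketch assuming a regex-based check; the object and method names (`IntervalValidator`, `isValidInterval`) are hypothetical, not the actual OpenHouse implementation:

```scala
// Illustrative sketch of server-side interval validation, not the real
// OpenHouse code. Accepts 12H (hourly) or 1D/2D/3D (daily) intervals,
// with a case-insensitive granularity suffix.
object IntervalValidator {
  private val Hourly = "12[hH]".r
  private val Daily  = "[123][dD]".r

  def isValidInterval(interval: String): Boolean = interval match {
    case Hourly() => true  // exactly "12h" / "12H"
    case Daily()  => true  // "1d".."3d" / "1D".."3D"
    case _        => false // anything else (e.g. "1", "1Y") is rejected
  }
}
```

Under these assumptions, `isValidInterval("12H")` and `isValidInterval("2D")` pass, while a bare `1` or a `1Y` suffix is rejected, matching the behavior shown in the testing section below.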

Changes

  • Client-facing API Changes
  • Internal API Changes
  • Bug Fixes
  • New Features
  • Performance Improvements
  • Code Style
  • Refactoring
  • Documentation
  • Tests

For all the boxes checked, please include additional details of the changes made in this pull request.

Testing Done

  • Manually tested on local Docker setup. Please include commands run and their output.
  • Added new tests for the changes made.
  • Updated existing tests to reflect the changes made.
  • No tests added or updated. Please explain why. If unsure, please feel free to ask for help.
  • Some other form of testing like staging or soak time in production. Please explain.

For all the boxes checked, include a detailed description of the testing done for the changes made in this pull request.
Updated unit tests for the SQL statements and tested in a local Docker setup:

scala> spark.sql("ALTER TABLE u_tableowner.test SET POLICY (REPLICATION=({destination:'a', interval:1D}))")
res6: org.apache.spark.sql.DataFrame = []
scala> spark.sql("ALTER TABLE u_tableowner.test SET POLICY (REPLICATION=({destination:'a', interval:12H}))")
res8: org.apache.spark.sql.DataFrame = []

Using anything other than h/H or d/D throws an exception:

scala> spark.sql("ALTER TABLE u_tableowner.test SET POLICY (REPLICATION=({destination:'a', interval:1}))")
com.linkedin.openhouse.spark.sql.catalyst.parser.extensions.OpenhouseParseException: no viable alternative at input 'interval:1'; line 1 pos 82
scala> spark.sql("ALTER TABLE u_tableowner.test SET POLICY (REPLICATION=({destination:'a', interval:1Y}))")
com.linkedin.openhouse.spark.sql.catalyst.parser.extensions.OpenhouseParseException: no viable alternative at input 'interval:1Y'; line 1 pos 82
  at com.linkedin.openhouse.spark.sql.catalyst.parser.extensions.OpenhouseParseErrorListener$.syntaxError(OpenhouseSparkSqlExtensionsParser.scala:123)
  at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)

Additional Information

  • Breaking Changes
  • Deprecations
  • Large PR broken into smaller PRs, and PR plan linked in the description.

For all the boxes checked, include additional details of the changes made in this pull request.

@chenselena chenselena marked this pull request as ready for review October 22, 2024 00:16
@rohitkum2506 (Collaborator)

Thanks for adding the DAY granularity. The description looks incomplete, can you update?
Also, can you add an example supporting this case: "Using anything other than h/H or d/D throws an exception"?

@chenselena (Collaborator, Author)

Thanks for adding the DAY granularity. The description looks incomplete, can you update? Also, can you add an example supporting this case: "Using anything other than h/H or d/D throws an exception"?

@rohitkum2506 updated for both 👍

@rohitkum2506 (Collaborator) left a comment

LGTM, thanks for updating

@chenselena chenselena merged commit ba2f50a into linkedin:main Oct 22, 2024
1 check passed