Add support for Spark SQL #29

Merged

dvirsky merged 3 commits into master from sql on Jun 7, 2016

Conversation

dvirsky (Contributor) commented May 25, 2016

No description provided.

The review comments below refer to this excerpt from the diff:

    // take the first (idx == 0) node whose slot range covers the given slot
    redisConfig.hosts.filter(node => node.startSlot <= slot && node.endSlot >= slot).filter(_.idx == 0)(0)
  }

  def insert(data: DataFrame, overwrite: Boolean): Unit = {
    data.foreach {
dvirsky (author) commented on the diff:

It would be much much faster if we partition the data somehow and group the queries to be sent in a pipeline. The current way is very slow.

WDYT?
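
A minimal sketch of the pipelined approach being discussed, assuming Jedis as the client and one connection per Spark partition; the two-column key/value mapping is hypothetical, purely for illustration:

    import org.apache.spark.sql.{DataFrame, Row}
    import redis.clients.jedis.Jedis

    def insertPipelined(data: DataFrame, host: String, port: Int): Unit = {
      data.foreachPartition { rows: Iterator[Row] =>
        val jedis = new Jedis(host, port)
        val pipeline = jedis.pipelined()   // buffer commands client-side
        rows.foreach { row =>
          // hypothetical mapping: first column as key, second as value
          pipeline.set(row.getString(0), row.getString(1))
        }
        pipeline.sync()                    // flush once per partition, not per row
        jedis.close()
      }
    }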

dvirsky replied May 25, 2016:

Perhaps we can add something like "getPipeline" and "syncPipeline" to Endpoint, so we can flush at the end and not for each row.
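
A rough sketch of what those two Endpoint additions might look like, assuming Endpoint wraps a host/port pair and can open a Jedis connection (the method names come from the comment above; the rest is an assumption):

    import redis.clients.jedis.{Jedis, Pipeline}

    case class Endpoint(host: String, port: Int) {
      def connect(): Jedis = new Jedis(host, port)

      // hand the caller a pipeline so writes can be batched...
      def getPipeline(conn: Jedis): Pipeline = conn.pipelined()

      // ...and flushed in a single round trip at the end, not once per row
      def syncPipeline(pipeline: Pipeline): Unit = pipeline.sync()
    }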

Another contributor replied:

Good idea, will use pipeline first.

dvirsky commented Jun 7, 2016

I'm merging it, but some documentation is still needed, including in the main doc. I'll create a separate issue for it.

dvirsky merged commit 5b23452 into master on Jun 7, 2016
gkorland deleted the sql branch on January 26, 2021