Backend: Databricks Spark SQL

See vignette("translation-function") and vignette("translation-verb") for details of overall translation technology. Key differences for this backend are better translation of statistical aggregate functions (e.g. var(), median()) and use of temporary views instead of temporary tables when copying data.
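As a quick, hedged sketch (the column names are illustrative and the exact SQL text depends on your dbplyr version), the translation of a statistical aggregate can be inspected without any live cluster by combining a simulated connection with show_query():

library(dplyr, warn.conflicts = FALSE)
library(dbplyr)

# A purely simulated Spark SQL connection: nothing is sent to a cluster
lf <- lazy_frame(g = "a", x = 1, con = simulate_spark_sql())

# median() and var() are translated to Spark SQL aggregate functions;
# show_query() prints the SQL that would be generated
lf %>%
  summarise(m = median(x, na.rm = TRUE), v = var(x, na.rm = TRUE), .by = g) %>%
  show_query()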

Use simulate_spark_sql() with lazy_frame() to see simulated SQL without connecting to a live database.

Usage

simulate_spark_sql()

Examples

library(dplyr, warn.conflicts = FALSE)

lf <- lazy_frame(a = TRUE, b = 1, d = 2, c = "z", con = simulate_spark_sql())

lf %>% summarise(x = median(d, na.rm = TRUE))
lf %>% summarise(x = var(c, na.rm = TRUE), .by = d)
lf %>% mutate(x = first(c))
lf %>% mutate(x = first(c), .by = d)
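The use of temporary views mentioned above applies when local data is copied to a real Databricks connection. A minimal, hedged sketch, assuming a live DBI connection (for example via the odbc package); the connection details below are placeholders and are not part of this help page, so the lines are left commented out:

# con <- DBI::dbConnect(odbc::databricks(), httpPath = "<warehouse HTTP path>")
#
# # On this backend, copy_to() registers the data as a temporary view rather
# # than creating a temporary table
# mtcars_spark <- copy_to(con, mtcars, "mtcars_tmp")
# mtcars_spark %>% summarise(v = var(mpg, na.rm = TRUE), .by = cyl)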
  • Maintainer: Hadley Wickham
  • License: MIT + file LICENSE
  • Last published: 2024-03-19