├── .Rbuildignore ├── .github ├── CODEOWNERS └── workflows │ └── R-CMD-check.yaml ├── .gitignore ├── CODE_OF_CONDUCT.md ├── DESCRIPTION ├── LICENSE ├── NAMESPACE ├── R ├── connections.R ├── globals.R ├── init.R ├── table_functions.R └── utils.R ├── README.Rmd ├── README.md ├── dbcooper.Rproj ├── man-roxygen └── con-id.R ├── man ├── assign_table_function.Rd ├── dbc_add_connection.Rd ├── dbc_clear_connections.Rd ├── dbc_execute.Rd ├── dbc_get_connection.Rd ├── dbc_init.Rd ├── dbc_list_tables.Rd ├── dbc_param.Rd ├── dbc_query.Rd ├── dbc_table.Rd ├── query_from_str.Rd └── set_option.Rd └── tests ├── testthat.R └── testthat ├── test-connections.R ├── test-init.R └── test-utils.R /.Rbuildignore: -------------------------------------------------------------------------------- 1 | ^.*\.Rproj$ 2 | ^\.Rproj\.user$ 3 | ^README.Rmd 4 | ^CODE_OF_CONDUCT.md 5 | man-roxygen/* 6 | .travis.yml 7 | .github 8 | -------------------------------------------------------------------------------- /.github/CODEOWNERS: -------------------------------------------------------------------------------- 1 | # These owners will be the default owners for everything in 2 | # the repo. Unless a later match takes precedence, 3 | # the users below will be requested for review 4 | # when someone opens a pull request. 5 | * @chriscardillo 6 | -------------------------------------------------------------------------------- /.github/workflows/R-CMD-check.yaml: -------------------------------------------------------------------------------- 1 | on: 2 | push: 3 | branches: 4 | - main 5 | pull_request: 6 | branches: 7 | - main 8 | 9 | name: R-CMD-check 10 | 11 | jobs: 12 | R-CMD-check: 13 | runs-on: macOS-latest 14 | steps: 15 | - uses: actions/checkout@v3 16 | - uses: r-lib/actions/setup-r@v2 17 | - name: Install dependencies 18 | run: | 19 | install.packages(c("remotes", "rcmdcheck")) 20 | remotes::install_deps(dependencies = TRUE) 21 | shell: Rscript {0} 22 | - name: Check 23 | run: rcmdcheck::rcmdcheck(args = "--no-manual", error_on = "error") 24 | shell: Rscript {0} 25 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # History files 2 | .Rhistory 3 | .Rapp.history 4 | 5 | # Session Data files 6 | .RData 7 | 8 | # User-specific files 9 | .Ruserdata 10 | 11 | # Example code in package build process 12 | *-Ex.R 13 | 14 | # Output files from R CMD build 15 | /*.tar.gz 16 | 17 | # Output files from R CMD check 18 | /*.Rcheck/ 19 | 20 | # RStudio files 21 | .Rproj.user/ 22 | 23 | # produced vignettes 24 | vignettes/*.html 25 | vignettes/*.pdf 26 | 27 | # OAuth2 token, see https://github.com/hadley/httr/releases/tag/v0.3 28 | .httr-oauth 29 | 30 | # knitr and R markdown default cache directories 31 | *_cache/ 32 | /cache/ 33 | 34 | # Temporary files created by R markdown 35 | *.utf8.md 36 | *.knit.md 37 | 38 | # R Environment Variables 39 | .Renviron 40 | .Rproj.user 41 | # *.Rproj 42 | # .Rbuildignore 43 | .Rdata 44 | .DS_Store 45 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Code of Conduct 2 | 3 | As contributors and maintainers of this project, we pledge to respect all people who 4 | contribute through reporting issues, posting feature requests, updating documentation, 5 | submitting pull requests or patches, and other activities. 
6 | 7 | We are committed to making participation in this project a harassment-free experience for 8 | everyone, regardless of level of experience, gender, gender identity and expression, 9 | sexual orientation, disability, personal appearance, body size, race, ethnicity, age, or religion. 10 | 11 | Examples of unacceptable behavior by participants include the use of sexual language or 12 | imagery, derogatory comments or personal attacks, trolling, public or private harassment, 13 | insults, or other unprofessional conduct. 14 | 15 | Project maintainers have the right and responsibility to remove, edit, or reject comments, 16 | commits, code, wiki edits, issues, and other contributions that are not aligned to this 17 | Code of Conduct. Project maintainers who do not follow the Code of Conduct may be removed 18 | from the project team. 19 | 20 | Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by 21 | opening an issue or contacting one or more of the project maintainers. 22 | 23 | This Code of Conduct is adapted from the Contributor Covenant 24 | (https://www.contributor-covenant.org), version 1.0.0, available at 25 | https://contributor-covenant.org/version/1/0/0/. 26 | -------------------------------------------------------------------------------- /DESCRIPTION: -------------------------------------------------------------------------------- 1 | Package: dbcooper 2 | Title: Access Databases Through Autogenerated Functions 3 | Version: 0.1.2 4 | Authors@R: c( 5 | person("Chris", "Cardillo", email = "cfcardillo23@gmail.com", role = c("cre", "aut")), 6 | person("David", "Robinson", email = "admiral.david@gmail.com", role = "aut"), 7 | person("Ramnath", "Vaidyanathan", role = "ctb")) 8 | Maintainer: Chris Cardillo 9 | Description: Create accessor functions for each table in a database, 10 | making it easy to work with them using the 'dbplyr' package. 11 | License: MIT + file LICENSE 12 | Encoding: UTF-8 13 | Imports: 14 | dplyr, 15 | DBI, 16 | purrr, 17 | dbplyr, 18 | snakecase 19 | RoxygenNote: 7.2.1 20 | Suggests: 21 | yaml, 22 | Lahman, 23 | RSQLite, 24 | testthat (>= 3.0.0) 25 | Config/testthat/edition: 3 26 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | YEAR: 2022 2 | COPYRIGHT HOLDER: Chris Cardillo 3 | -------------------------------------------------------------------------------- /NAMESPACE: -------------------------------------------------------------------------------- 1 | # Generated by roxygen2: do not edit by hand 2 | 3 | S3method(dbc_init,default) 4 | S3method(dbc_init,src_sql) 5 | S3method(dbc_list_tables,PqConnection) 6 | S3method(dbc_list_tables,Snowflake) 7 | S3method(dbc_list_tables,SnowflakeDBConnection) 8 | S3method(dbc_list_tables,default) 9 | S3method(dbc_list_tables,duckdb_connection) 10 | export(dbc_add_connection) 11 | export(dbc_clear_connections) 12 | export(dbc_get_connection) 13 | export(dbc_init) 14 | export(dbc_param) 15 | importFrom(dplyr,"%>%") 16 | importFrom(snakecase,to_snake_case) 17 | -------------------------------------------------------------------------------- /R/connections.R: -------------------------------------------------------------------------------- 1 | # Work with global connections 2 | 3 | #' Add a connection or pool to the global options 4 | #' 5 | #' dbcooper maintains a named list of connections or 6 | #' (recommended) pools. This internal function adds 7 | #' one. 
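#'
#' @examples
#' \dontrun{
#' # A minimal sketch: register an in-memory SQLite connection under the id
#' # "demo" (an arbitrary label) so the other dbc_* helpers can look it up later
#' con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
#' dbc_add_connection(con, "demo")
#' }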
8 | #' 9 | #' @param con Connection or pool object 10 | #' @template con-id 11 | #' 12 | #' @export 13 | dbc_add_connection <- function(con, con_id) { 14 | connections <- getOption("dbc_connections") 15 | 16 | if (is.null(connections)) { 17 | connections <- list() 18 | } 19 | 20 | connections[[con_id]] <- con 21 | 22 | options(dbc_connections = connections) 23 | } 24 | 25 | #' Retrieve a connection or pool from the global options 26 | #' 27 | #' dbcooper maintains a named list of connections or 28 | #' (recommended) pools. This internal function retrieves 29 | #' one. 30 | #' 31 | #' @template con-id 32 | #' 33 | #' @export 34 | dbc_get_connection <- function(con_id) { 35 | connections <- getOption("dbc_connections") 36 | 37 | con <- connections[[con_id]] 38 | 39 | if (is.null(con)) { 40 | stop("Connection not found: ", con) 41 | } 42 | 43 | con 44 | } 45 | 46 | #' Clear all connections created by dbc 47 | #' 48 | #' @export 49 | dbc_clear_connections <- function() { 50 | connections <- getOption("dbc_connections") 51 | 52 | for (con in connections) { 53 | purrr::possibly(DBI::dbDisconnect, NULL)(con) 54 | } 55 | 56 | options(dbc_connections = list()) 57 | } -------------------------------------------------------------------------------- /R/globals.R: -------------------------------------------------------------------------------- 1 | globalVariables(c("table_name_raw", "table_schema", "schema_name", "name", 2 | "database_name")) 3 | -------------------------------------------------------------------------------- /R/init.R: -------------------------------------------------------------------------------- 1 | # Initialize the tables based on a database connection 2 | 3 | #' Create one table function 4 | #' 5 | #' @param table_name Name of the table 6 | #' @template con-id 7 | #' @param env Environment in which to create the table accessors, such 8 | #' as the global environment or a package namespace. 9 | #' @param table_formatter Optionally, a function to clean the table name 10 | #' before turning it into a function name, such as removing prefixes 11 | #' @param table_post Post-processing to perform on each table before 12 | #' returning it 13 | #' 14 | #' @importFrom snakecase to_snake_case 15 | assign_table_function <- function(table_name, 16 | con_id, 17 | env = parent.frame(), 18 | table_formatter = snakecase::to_snake_case, 19 | table_post = identity) { 20 | # Create the function 21 | fun <- function() table_post(dbc_table(table_name, con_id)) 22 | attr(fun, 'connection') <- con_id 23 | attr(fun, 'table') <- table_name 24 | 25 | clean_name <- table_formatter(table_name) 26 | 27 | function_name <- paste0(con_id, "_", clean_name) 28 | assign(function_name, fun, pos = env) 29 | } 30 | 31 | #' Create functions for accessing a remote database 32 | #' 33 | #' Create and assign functions that make accessing a database easy. 34 | #' These include \code{tbl_} functions for each table in the database, 35 | #' as well as \code{query_[id]} and \code{execute_[id]} functions for 36 | #' querying and executing SQL in the database. 37 | #' 38 | #' @param con A DBI-compliant database connection object, or a 39 | #' \code{src_dbi}. 40 | #' @param con_id A short string that identifies the database. This 41 | #' is used to create functions \code{query_}, \code{tbl_} and 42 | #' \code{execute_} with appropriate names, as well as to cache the 43 | #' connection globally. 44 | #' @param env Environment in which to create the table accessors, such 45 | #' as the global environment or a package namespace. 
46 | #' @param tables Optionally, a vector of tables. Useful if dbcooper's 47 | #' table listing functions don't work for a database, or if you want to 48 | #' use only a subset of tables. 49 | #' @param table_prefix Optionally, a prefix to append to each table, 50 | #' usually a schema. 51 | #' @param table_formatter Optionally, a function to clean the table name 52 | #' before turning it into a function name, such as removing prefixes. 53 | #' By default, \code{\link[snakecase]{to_snake_case}}. 54 | #' @param table_post Optionally, post-processing to perform on each table before 55 | #' returning it. 56 | #' @param ... Arguments passed on to \code{dbc_init.default}. 57 | #' 58 | #' @examples 59 | #' 60 | #' library(dplyr) 61 | #' library(dbplyr) 62 | #' 63 | #' # Initialize based on a SQL src or connection object 64 | #' src <- lahman_sqlite() 65 | #' dbc_init(src, "lahman") 66 | #' 67 | #' ## Tables 68 | #' 69 | #' # Access each table using autocompleted functions 70 | #' lahman_batting() 71 | #' 72 | #' # Can also pass the name of a table as a string to lahman_tbl 73 | #' lahman_tbl("Pitching") 74 | #' 75 | #' # Pass no argument to get a vector of all the tables 76 | #' lahman_list() 77 | #' 78 | #' # Run a SQL query 79 | #' lahman_query("SELECT COUNT(*) FROM Managers") 80 | #' 81 | #' # Execute queries that change the database 82 | #' lahman_execute("CREATE TABLE Players AS 83 | #' SELECT playerID, sum(AB) as AB FROM Batting GROUP BY playerID" 84 | #' ) 85 | #' 86 | #' lahman_tbl("Players") 87 | #' 88 | #' lahman_execute("DROP TABLE Players") 89 | #' 90 | #' @export 91 | dbc_init <- function(con, con_id, env = parent.frame(), ...) { 92 | UseMethod("dbc_init") 93 | } 94 | 95 | #' @rdname dbc_init 96 | #' @export 97 | dbc_init.default <- function(con, 98 | con_id, 99 | env = parent.frame(), 100 | tables = NULL, 101 | table_prefix = NULL, 102 | table_formatter = snakecase::to_snake_case, 103 | table_post = identity, 104 | assign_table_functions = TRUE, 105 | ...) { 106 | # Assign the connection/pool globally so it can be accessed later 107 | dbc_add_connection(con, con_id) 108 | 109 | # Create functions for querying and getting a single table 110 | list_fun <- function(query) { dbc_list_tables(dbc_get_connection(con_id)) } 111 | assign(paste0(con_id, "_list"), list_fun, pos = env) 112 | 113 | query_fun <- function(query) { table_post(dbc_query(query, con_id)) } 114 | assign(paste0(con_id, "_query"), query_fun, pos = env) 115 | 116 | tbl_fun <- function(table_name = NULL) { table_post(dbc_table(paste0(table_prefix, table_name), con_id)) } 117 | assign(paste0(con_id, "_tbl"), tbl_fun, pos = env) 118 | 119 | exec_fun <- function(query) { dbc_execute(query, con_id) } 120 | assign(paste0(con_id, "_execute"), exec_fun, pos = env) 121 | 122 | src_fun <- function() { dbplyr::src_dbi(dbc_get_connection(con_id)) } 123 | assign(paste0(con_id, "_src"), src_fun, pos = env) 124 | 125 | if (assign_table_functions) { 126 | 127 | # Create functions for each individual table 128 | if (is.null(tables)) { 129 | tables <- dbc_list_tables(con) 130 | } 131 | 132 | invisible(purrr::map(tables, 133 | assign_table_function, 134 | con_id, 135 | env = env, 136 | table_formatter = table_formatter, 137 | table_post = table_post)) 138 | } 139 | } 140 | 141 | #' @rdname dbc_init 142 | #' @export 143 | dbc_init.src_sql <- function(con, con_id, env = parent.frame(), ...) { 144 | dbc_init(con$con, con_id, env = parent.frame(), ...) 
145 | } 146 | -------------------------------------------------------------------------------- /R/table_functions.R: -------------------------------------------------------------------------------- 1 | ## First, utilities to access a table from a connection ID. 2 | 3 | #' Access a single table 4 | #' 5 | #' @param table_name Name of the table in the database. If no table is 6 | #' provided, returns a list of tables as a character vector. 7 | #' @template con-id 8 | dbc_table <- function(table_name = NULL, con_id) { 9 | con <- dbc_get_connection(con_id) 10 | 11 | if (is.null(table_name)) { 12 | return(dbc_list_tables(con)) 13 | } 14 | 15 | # Try using a select instead of in_schema 16 | return(dplyr::tbl(con, dbplyr::sql(paste("( SELECT * FROM", table_name, ")")))) 17 | 18 | if(!grepl("\\.", table_name)){ 19 | dplyr::tbl(con, table_name) 20 | } else if (grepl(".\\.", table_name)) { 21 | table_split <- strsplit(table_name, "\\.")[[1]] 22 | schema <- table_split[1] 23 | table <- table_split[2] 24 | dplyr::tbl(con, dbplyr::in_schema(schema, table)) 25 | } else if(grepl("^\\.", table_name)){ 26 | table <- paste0("public", gsub("^\\.", "_", table_name)) 27 | dplyr::tbl(con, table) 28 | } 29 | } 30 | 31 | #' Given a string, turn it into a SQL query 32 | #' 33 | #' Internal function to pull a query from a string. If the string is in a 34 | #' YAML or plain text file, read it 35 | #' 36 | #' @param query A string or filename 37 | query_from_str <- function(query) { 38 | if (grepl("*.yml$", query)) { 39 | if (!requireNamespace("yaml", quietly = TRUE)) { 40 | stop("Reading a yml file requires the yaml package to be installed") 41 | } 42 | 43 | yaml <- yaml::read_yaml(query) 44 | 45 | if("sql" %in% names(yaml)) { 46 | query <- yaml[["sql"]] 47 | } else { 48 | stop(paste0("No parameter 'sql' found in file ", query)) 49 | } 50 | } else if (file.exists(query)) { 51 | # Query is in a file; read it; remove frontmatter 52 | query <- gsub("^---\n.*---\n", "", paste(readLines(query), collapse = "\n")) 53 | } 54 | paste("(", query, ")") 55 | } 56 | 57 | #' Run a query on a SQL database and get a remote table back 58 | #' 59 | #' @param query Either a SQL query as a string, a file containing a SQL 60 | #' query, or a YAML file with a \code{sql} parameter. 61 | #' @template con-id 62 | #' 63 | #' @return A \code{tbl_sql} with the remote results 64 | dbc_query <- function(query, con_id) { 65 | dplyr::tbl(dbc_get_connection(con_id), dplyr::sql(query_from_str(query))) 66 | } 67 | 68 | #' Execute a query on a SQL database 69 | #' 70 | #' @param query Either a SQL query as a string, a file containing a SQL 71 | #' query, or a YAML file with a \code{sql} parameter. 72 | #' @template con-id 73 | dbc_execute <- function(query, con_id) { 74 | DBI::dbExecute(dbc_get_connection(con_id), query) 75 | } 76 | -------------------------------------------------------------------------------- /R/utils.R: -------------------------------------------------------------------------------- 1 | #' Require a parameter by name 2 | #' 3 | #' Throws an error if the parameter is not set. 4 | #' 5 | #' @param param_name Parameter to look up with \code{\link[base]{Sys.getenv}}. 6 | #' 7 | #' @export 8 | dbc_param <- function(param_name) { 9 | ret <- Sys.getenv(param_name) 10 | if (!nzchar(ret)) { 11 | stop("You must set ", param_name, " in your .Renviron (then restart R)") 12 | } 13 | ret 14 | } 15 | 16 | #' Set an option by name 17 | #' 18 | #' Utility. This is a bit difficult to do with the options() function. 
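#'
#' @examples
#' \dontrun{
#' # A minimal sketch: equivalent to options(my_flag = TRUE), but the option
#' # name ("my_flag" here is arbitrary) is held in a character string rather
#' # than written as a literal argument name
#' set_option("my_flag", TRUE)
#' getOption("my_flag")
#' }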
19 | #' 20 | #' @param option_name Name 21 | #' @param value Value 22 | set_option <- function(option_name, value) { 23 | args <- stats::setNames(list(value), option_name) 24 | do.call(options, args) 25 | } 26 | 27 | #' List tables in a database 28 | #' 29 | #' dbListTables doesn't work for all types of databases, especially 30 | #' ones that have schemas. We use this approach instead. 31 | #' Work in progress as we test across different database types. 32 | #' 33 | #' @param con A connection or pool object 34 | #' @param exclude_schemas Schemas for which no tables should be returned. 35 | #' The default excludes information_schema and pg_catalog, which typically 36 | #' contain database metadata. 37 | #' 38 | #' @importFrom dplyr %>% 39 | dbc_list_tables <- function(con, exclude_schemas) { 40 | UseMethod("dbc_list_tables") 41 | } 42 | 43 | #' @rdname dbc_list_tables 44 | #' @export 45 | dbc_list_tables.default <- function(con, 46 | exclude_schemas = c("information_schema", "pg_catalog")) { 47 | tables <- DBI::dbListTables(con) 48 | 49 | # Remove ones that match the regex 50 | exclude_regex <- paste0(exclude_schemas, "\\.", collapse = "|") 51 | tables <- tables[!grepl(exclude_regex, tables)] 52 | return(tables) 53 | } 54 | 55 | #' @rdname dbc_list_tables 56 | #' @export 57 | dbc_list_tables.PqConnection <- function(con, exclude_schemas = c("information_schema", "pg_catalog")) { 58 | # Base Query, currently meant for postgres and mysql only 59 | query <- "SELECT CONCAT(table_schema, '.', table_name) AS table_name_raw, table_schema 60 | FROM information_schema.tables" 61 | 62 | tables <- DBI::dbGetQuery(con, query) %>% 63 | dplyr::filter(!table_schema %in% exclude_schemas) %>% 64 | dplyr::select(table_name_raw) %>% 65 | dplyr::pull() 66 | 67 | return(tables) 68 | } 69 | 70 | #' @rdname dbc_list_tables 71 | #' @export 72 | dbc_list_tables.Snowflake <- function(con, 73 | exclude_schemas = c("INFORMATION_SCHEMA")) { 74 | tables <- DBI::dbGetQuery(con, "SHOW TERSE TABLES") %>% 75 | dplyr::select(database_name, schema_name, name) 76 | 77 | views <- DBI::dbGetQuery(con, "SHOW TERSE VIEWS") %>% 78 | dplyr::select(database_name, schema_name, name) 79 | 80 | dplyr::bind_rows(tables, views) %>% 81 | dplyr::filter(!schema_name %in% exclude_schemas) %>% 82 | with(paste(database_name, schema_name, name, sep = ".")) 83 | } 84 | 85 | #' @rdname dbc_list_tables 86 | #' @export 87 | dbc_list_tables.SnowflakeDBConnection <- function(con, 88 | exclude_schemas = c("INFORMATION_SCHEMA")) { 89 | # The dplyr.snowflakedb package 90 | dbc_list_tables.Snowflake(con, exclude_schemas) 91 | } 92 | 93 | 94 | #' @rdname dbc_list_tables 95 | #' @export 96 | dbc_list_tables.duckdb_connection <- function(con, 97 | exclude_schemas = c("INFORMATION_SCHEMA")) { 98 | dplyr::pull(DBI::dbGetQuery(con, "SELECT table_schema || '.' 
|| table_name AS table_name FROM information_schema.tables"), "table_name") 99 | } 100 | -------------------------------------------------------------------------------- /README.Rmd: -------------------------------------------------------------------------------- 1 | 2 | 3 | ```{r, include = FALSE} 4 | knitr::opts_chunk$set( 5 | collapse = TRUE, 6 | comment = "#>", 7 | fig.path = "man/figures/README-", 8 | out.width = "100%", 9 | error = FALSE 10 | ) 11 | ``` 12 | 13 | # dbcooper 14 | 15 | 16 | ![R-CMD-check](https://github.com/pipeline-tools/dbcooper/workflows/R-CMD-check/badge.svg) 17 | 18 | 19 | The dbcooper package turns a database connection into a collection of functions, handling logic for keeping track of connections and letting you take advantage of autocompletion when exploring a database. 20 | 21 | It's especially helpful to use when authoring database-specific R packages, for instance in an internal company package or one wrapping a public data source. 22 | 23 | The package's name is a reference to the bandit [D.B. Cooper](https://en.wikipedia.org/wiki/D._B._Cooper). 24 | 25 | * For the Python version of the package, see [machow/dbcooper-py](https://github.com/machow/dbcooper-py). 26 | * For an example of database packages created with dbcooper, see [stackbigquery](https://github.com/dgrtwo/stackbigquery/) or [lahmancooper](https://github.com/pipeline-tools/lahmancooper) 27 | * For some slides about the package, see [here](http://varianceexplained.org/files/dbcooper-rstudio-conf-2022.pdf) 28 | 29 | ## Installation 30 | 31 | You can install the development version from [GitHub](https://github.com/) with: 32 | 33 | ``` r 34 | # install.packages("devtools") 35 | devtools::install_github("pipeline-tools/dbcooper") 36 | ``` 37 | 38 | ## Example 39 | 40 | ### Initializing the functions 41 | 42 | The dbcooper package asks you to create the connection first. As an example, we'll use the Lahman baseball database packaged with dbplyr. 43 | 44 | ```{r message = FALSE} 45 | library(dplyr) 46 | 47 | lahman_db <- dbplyr::lahman_sqlite() 48 | lahman_db 49 | ``` 50 | 51 | You set up dbcooper with the `dbc_init` function, passing it a prefix `lahman` that will apply to all the functions it creates. 52 | 53 | ```{r} 54 | library(dbcooper) 55 | dbc_init(lahman_db, "lahman") 56 | ``` 57 | 58 | `dbc_init` then creates user-friendly accessor functions in your global environment. (You could also pass it an environment in which the functions will be created). 59 | 60 | ### Using database functions 61 | 62 | `dbc_init` adds several functions when it initializes a database source. In this case, each will start with the `lahman_` prefix. 63 | 64 | * `_list`: Get a list of tables 65 | * `_tbl`: Access a table that can be worked with in dbplyr 66 | * `_query`: Perform of a SQL query and work with the result 67 | * `_execute`: Execute a query (such as a `CREATE` or `DROP`) 68 | * `_src`: Retrieve a `dbi_src` for the database 69 | 70 | For instance, we could start by finding the names of the tables in the Lahman database. 71 | 72 | ```{r} 73 | lahman_list() 74 | ``` 75 | 76 | We can access one of these tables with `lahman_tbl()`, then put it through any kind of dplyr operation. 77 | 78 | ```{r} 79 | lahman_tbl("Batting") 80 | 81 | lahman_tbl("Batting") %>% 82 | count(teamID, sort = TRUE) 83 | ``` 84 | 85 | If we'd rather write SQL than dplyr, we could also run `lahman_query()` (which can also take a filename). 
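For example, a longer query can live in its own file: `lahman_query()` reads plain SQL files as well as `.yml` files that contain a `sql` field. A minimal sketch that writes a throwaway query to a temporary file:

``` r
sql_file <- tempfile(fileext = ".sql")
writeLines("SELECT teamID, COUNT(*) AS n FROM Batting GROUP BY teamID", sql_file)
lahman_query(sql_file)
```

Passing the SQL inline works just as well: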
86 | 87 | ```{r} 88 | lahman_query("SELECT 89 | playerID, 90 | sum(AB) as AB 91 | FROM Batting 92 | GROUP BY playerID") 93 | ``` 94 | 95 | Finally, `lahman_execute()` is for commands like `CREATE` and `DROP` that don't return a table, but rather execute a command on the database. 96 | 97 | ```{r} 98 | lahman_execute("CREATE TABLE Players AS 99 | SELECT playerID, SUM(AB) AS AB 100 | FROM Batting 101 | GROUP BY playerID") 102 | 103 | lahman_tbl("Players") 104 | 105 | lahman_execute("DROP TABLE Players") 106 | ``` 107 | 108 | ### Autocompleted tables 109 | 110 | Besides the `_list`, `_tbl`, `_query`, and `_execute` functions, the package also creates auto-completed table accessors. 111 | 112 | ```{r} 113 | # Same result as lahman_tbl("Batting") 114 | lahman_batting() 115 | 116 | # Same result as lahman_tbl("Managers") %>% count() 117 | lahman_managers() %>% 118 | count() 119 | ``` 120 | 121 | These are useful because they let you use auto-complete to complete table names as you're exploring a data source. 122 | 123 | ## Code of Conduct 124 | 125 | Please note that the 'dbcooper' project is released with a 126 | [Contributor Code of Conduct](https://github.com/pipeline-tools/dbcooper/blob/main/CODE_OF_CONDUCT.md). 127 | By contributing to this project, you agree to abide by its terms. 128 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | # dbcooper 6 | 7 | 8 | ![R-CMD-check](https://github.com/pipeline-tools/dbcooper/workflows/R-CMD-check/badge.svg) 9 | 10 | 11 | The dbcooper package turns a database connection into a collection of functions, handling logic for keeping track of connections and letting you take advantage of autocompletion when exploring a database. 12 | 13 | It's especially helpful to use when authoring database-specific R packages, for instance in an internal company package or one wrapping a public data source. 14 | 15 | The package's name is a reference to the bandit [D.B. Cooper](https://en.wikipedia.org/wiki/D._B._Cooper). 16 | 17 | * For the Python version of the package, see [machow/dbcooper-py](https://github.com/machow/dbcooper-py). 18 | * For an example of database packages created with dbcooper, see [stackbigquery](https://github.com/dgrtwo/stackbigquery/) or [lahmancooper](https://github.com/pipeline-tools/lahmancooper) 19 | * For some slides about the package, see [here](http://varianceexplained.org/files/dbcooper-rstudio-conf-2022.pdf) 20 | 21 | ## Installation 22 | 23 | You can install the development version from [GitHub](https://github.com/) with: 24 | 25 | ``` r 26 | # install.packages("devtools") 27 | devtools::install_github("pipeline-tools/dbcooper") 28 | ``` 29 | 30 | ## Example 31 | 32 | ### Initializing the functions 33 | 34 | The dbcooper package asks you to create the connection first. As an example, we'll use the Lahman baseball database packaged with dbplyr. 35 | 36 | 37 | ```r 38 | library(dplyr) 39 | 40 | lahman_db <- dbplyr::lahman_sqlite() 41 | lahman_db 42 | #> 43 | #> Path: /tmp/RtmpeApR4v/lahman.sqlite 44 | #> Extensions: TRUE 45 | ``` 46 | 47 | You set up dbcooper with the `dbc_init` function, passing it a prefix `lahman` that will apply to all the functions it creates. 48 | 49 | 50 | ```r 51 | library(dbcooper) 52 | dbc_init(lahman_db, "lahman") 53 | ``` 54 | 55 | `dbc_init` then creates user-friendly accessor functions in your global environment. 
(You could also pass it an environment in which the functions will be created). 56 | 57 | ### Using database functions 58 | 59 | `dbc_init` adds several functions when it initializes a database source. In this case, each will start with the `lahman_` prefix. 60 | 61 | * `_list`: Get a list of tables 62 | * `_tbl`: Access a table that can be worked with in dbplyr 63 | * `_query`: Perform of a SQL query and work with the result 64 | * `_execute`: Execute a query (such as a `CREATE` or `DROP`) 65 | * `_src`: Retrieve a `dbi_src` for the database 66 | 67 | For instance, we could start by finding the names of the tables in the Lahman database. 68 | 69 | 70 | ```r 71 | lahman_list() 72 | #> [1] "AllstarFull" "Appearances" "AwardsManagers" "AwardsPlayers" "AwardsShareManagers" "AwardsSharePlayers" "Batting" "BattingPost" 73 | #> [9] "CollegePlaying" "Fielding" "FieldingOF" "FieldingOFsplit" "FieldingPost" "HallOfFame" "HomeGames" "LahmanData" 74 | #> [17] "Managers" "ManagersHalf" "Parks" "People" "Pitching" "PitchingPost" "Salaries" "Schools" 75 | #> [25] "SeriesPost" "Teams" "TeamsFranchises" "TeamsHalf" "sqlite_stat1" "sqlite_stat4" 76 | ``` 77 | 78 | We can access one of these tables with `lahman_tbl()`, then put it through any kind of dplyr operation. 79 | 80 | 81 | ```r 82 | lahman_tbl("Batting") 83 | #> # Source: SQL [?? x 22] 84 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 85 | #> playerID yearID stint teamID lgID G AB R H X2B X3B HR RBI SB CS BB SO IBB HBP SH SF GIDP 86 | #> 87 | #> 1 abercda01 1871 1 TRO NA 1 4 0 0 0 0 0 0 0 0 0 0 NA NA NA NA 0 88 | #> 2 addybo01 1871 1 RC1 NA 25 118 30 32 6 0 0 13 8 1 4 0 NA NA NA NA 0 89 | #> 3 allisar01 1871 1 CL1 NA 29 137 28 40 4 5 0 19 3 1 2 5 NA NA NA NA 1 90 | #> 4 allisdo01 1871 1 WS3 NA 27 133 28 44 10 2 2 27 1 1 0 2 NA NA NA NA 0 91 | #> 5 ansonca01 1871 1 RC1 NA 25 120 29 39 11 3 0 16 6 2 2 1 NA NA NA NA 0 92 | #> 6 armstbo01 1871 1 FW1 NA 12 49 9 11 2 1 0 5 0 1 0 1 NA NA NA NA 0 93 | #> 7 barkeal01 1871 1 RC1 NA 1 4 0 1 0 0 0 2 0 0 1 0 NA NA NA NA 0 94 | #> 8 barnero01 1871 1 BS1 NA 31 157 66 63 10 9 0 34 11 6 13 1 NA NA NA NA 1 95 | #> 9 barrebi01 1871 1 FW1 NA 1 5 1 1 1 0 0 1 0 0 0 0 NA NA NA NA 0 96 | #> 10 barrofr01 1871 1 BS1 NA 18 86 13 13 2 1 0 11 1 0 0 0 NA NA NA NA 0 97 | #> # … with more rows 98 | 99 | lahman_tbl("Batting") %>% 100 | count(teamID, sort = TRUE) 101 | #> # Source: SQL [?? x 2] 102 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 103 | #> # Ordered by: desc(n) 104 | #> teamID n 105 | #> 106 | #> 1 CHN 5129 107 | #> 2 PHI 5026 108 | #> 3 PIT 4984 109 | #> 4 SLN 4904 110 | #> 5 CIN 4786 111 | #> 6 CLE 4731 112 | #> 7 BOS 4571 113 | #> 8 NYA 4530 114 | #> 9 CHA 4523 115 | #> 10 DET 4462 116 | #> # … with more rows 117 | ``` 118 | 119 | If we'd rather write SQL than dplyr, we could also run `lahman_query()` (which can also take a filename). 120 | 121 | 122 | ```r 123 | lahman_query("SELECT 124 | playerID, 125 | sum(AB) as AB 126 | FROM Batting 127 | GROUP BY playerID") 128 | #> # Source: SQL [?? 
x 2] 129 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 130 | #> playerID AB 131 | #> 132 | #> 1 aardsda01 4 133 | #> 2 aaronha01 12364 134 | #> 3 aaronto01 944 135 | #> 4 aasedo01 5 136 | #> 5 abadan01 21 137 | #> 6 abadfe01 9 138 | #> 7 abadijo01 49 139 | #> 8 abbated01 3044 140 | #> 9 abbeybe01 225 141 | #> 10 abbeych01 1756 142 | #> # … with more rows 143 | ``` 144 | 145 | Finally, `lahman_execute()` is for commands like `CREATE` and `DROP` that don't return a table, but rather execute a command on the database. 146 | 147 | 148 | ```r 149 | lahman_execute("CREATE TABLE Players AS 150 | SELECT playerID, SUM(AB) AS AB 151 | FROM Batting 152 | GROUP BY playerID") 153 | #> [1] 0 154 | 155 | lahman_tbl("Players") 156 | #> # Source: SQL [?? x 2] 157 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 158 | #> playerID AB 159 | #> 160 | #> 1 aardsda01 4 161 | #> 2 aaronha01 12364 162 | #> 3 aaronto01 944 163 | #> 4 aasedo01 5 164 | #> 5 abadan01 21 165 | #> 6 abadfe01 9 166 | #> 7 abadijo01 49 167 | #> 8 abbated01 3044 168 | #> 9 abbeybe01 225 169 | #> 10 abbeych01 1756 170 | #> # … with more rows 171 | 172 | lahman_execute("DROP TABLE Players") 173 | #> [1] 0 174 | ``` 175 | 176 | ### Autocompleted tables 177 | 178 | Besides the `_list`, `_tbl`, `_query`, and `_execute` functions, the package also creates auto-completed table accessors. 179 | 180 | 181 | ```r 182 | # Same result as lahman_tbl("Batting") 183 | lahman_batting() 184 | #> # Source: SQL [?? x 22] 185 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 186 | #> playerID yearID stint teamID lgID G AB R H X2B X3B HR RBI SB CS BB SO IBB HBP SH SF GIDP 187 | #> 188 | #> 1 abercda01 1871 1 TRO NA 1 4 0 0 0 0 0 0 0 0 0 0 NA NA NA NA 0 189 | #> 2 addybo01 1871 1 RC1 NA 25 118 30 32 6 0 0 13 8 1 4 0 NA NA NA NA 0 190 | #> 3 allisar01 1871 1 CL1 NA 29 137 28 40 4 5 0 19 3 1 2 5 NA NA NA NA 1 191 | #> 4 allisdo01 1871 1 WS3 NA 27 133 28 44 10 2 2 27 1 1 0 2 NA NA NA NA 0 192 | #> 5 ansonca01 1871 1 RC1 NA 25 120 29 39 11 3 0 16 6 2 2 1 NA NA NA NA 0 193 | #> 6 armstbo01 1871 1 FW1 NA 12 49 9 11 2 1 0 5 0 1 0 1 NA NA NA NA 0 194 | #> 7 barkeal01 1871 1 RC1 NA 1 4 0 1 0 0 0 2 0 0 1 0 NA NA NA NA 0 195 | #> 8 barnero01 1871 1 BS1 NA 31 157 66 63 10 9 0 34 11 6 13 1 NA NA NA NA 1 196 | #> 9 barrebi01 1871 1 FW1 NA 1 5 1 1 1 0 0 1 0 0 0 0 NA NA NA NA 0 197 | #> 10 barrofr01 1871 1 BS1 NA 18 86 13 13 2 1 0 11 1 0 0 0 NA NA NA NA 0 198 | #> # … with more rows 199 | 200 | # Same result as lahman_tbl("Managers") %>% count() 201 | lahman_managers() %>% 202 | count() 203 | #> # Source: SQL [1 x 1] 204 | #> # Database: sqlite 3.39.2 [/tmp/RtmpeApR4v/lahman.sqlite] 205 | #> n 206 | #> 207 | #> 1 3684 208 | ``` 209 | 210 | These are useful because they let you use auto-complete to complete table names as you're exploring a data source. 211 | 212 | ## Code of Conduct 213 | 214 | Please note that the 'dbcooper' project is released with a 215 | [Contributor Code of Conduct](https://github.com/pipeline-tools/dbcooper/blob/main/CODE_OF_CONDUCT.md). 216 | By contributing to this project, you agree to abide by its terms. 
217 | -------------------------------------------------------------------------------- /dbcooper.Rproj: -------------------------------------------------------------------------------- 1 | Version: 1.0 2 | 3 | RestoreWorkspace: Default 4 | SaveWorkspace: Default 5 | AlwaysSaveHistory: Default 6 | 7 | EnableCodeIndexing: Yes 8 | UseSpacesForTab: Yes 9 | NumSpacesForTab: 2 10 | Encoding: UTF-8 11 | 12 | RnwWeave: Sweave 13 | LaTeX: pdfLaTeX 14 | 15 | BuildType: Package 16 | PackageUseDevtools: Yes 17 | PackageInstallArgs: --no-multiarch --with-keep.source 18 | -------------------------------------------------------------------------------- /man-roxygen/con-id.R: -------------------------------------------------------------------------------- 1 | #' @param con_id A short string describing a globally cached connection pool 2 | -------------------------------------------------------------------------------- /man/assign_table_function.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/init.R 3 | \name{assign_table_function} 4 | \alias{assign_table_function} 5 | \title{Create one table function} 6 | \usage{ 7 | assign_table_function( 8 | table_name, 9 | con_id, 10 | env = parent.frame(), 11 | table_formatter = snakecase::to_snake_case, 12 | table_post = identity 13 | ) 14 | } 15 | \arguments{ 16 | \item{table_name}{Name of the table} 17 | 18 | \item{con_id}{A short string describing a globally cached connection pool} 19 | 20 | \item{env}{Environment in which to create the table accessors, such 21 | as the global environment or a package namespace.} 22 | 23 | \item{table_formatter}{Optionally, a function to clean the table name 24 | before turning it into a function name, such as removing prefixes} 25 | 26 | \item{table_post}{Post-processing to perform on each table before 27 | returning it} 28 | } 29 | \description{ 30 | Create one table function 31 | } 32 | -------------------------------------------------------------------------------- /man/dbc_add_connection.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/connections.R 3 | \name{dbc_add_connection} 4 | \alias{dbc_add_connection} 5 | \title{Add a connection or pool to the global options} 6 | \usage{ 7 | dbc_add_connection(con, con_id) 8 | } 9 | \arguments{ 10 | \item{con}{Connection or pool object} 11 | 12 | \item{con_id}{A short string describing a globally cached connection pool} 13 | } 14 | \description{ 15 | dbcooper maintains a named list of connections or 16 | (recommended) pools. This internal function adds 17 | one. 
18 | } 19 | -------------------------------------------------------------------------------- /man/dbc_clear_connections.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/connections.R 3 | \name{dbc_clear_connections} 4 | \alias{dbc_clear_connections} 5 | \title{Clear all connections created by dbc} 6 | \usage{ 7 | dbc_clear_connections() 8 | } 9 | \description{ 10 | Clear all connections created by dbc 11 | } 12 | -------------------------------------------------------------------------------- /man/dbc_execute.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/table_functions.R 3 | \name{dbc_execute} 4 | \alias{dbc_execute} 5 | \title{Execute a query on a SQL database} 6 | \usage{ 7 | dbc_execute(query, con_id) 8 | } 9 | \arguments{ 10 | \item{query}{Either a SQL query as a string, a file containing a SQL 11 | query, or a YAML file with a \code{sql} parameter.} 12 | 13 | \item{con_id}{A short string describing a globally cached connection pool} 14 | } 15 | \description{ 16 | Execute a query on a SQL database 17 | } 18 | -------------------------------------------------------------------------------- /man/dbc_get_connection.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/connections.R 3 | \name{dbc_get_connection} 4 | \alias{dbc_get_connection} 5 | \title{Retrieve a connection or pool from the global options} 6 | \usage{ 7 | dbc_get_connection(con_id) 8 | } 9 | \arguments{ 10 | \item{con_id}{A short string describing a globally cached connection pool} 11 | } 12 | \description{ 13 | dbcooper maintains a named list of connections or 14 | (recommended) pools. This internal function retrieves 15 | one. 16 | } 17 | -------------------------------------------------------------------------------- /man/dbc_init.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/init.R 3 | \name{dbc_init} 4 | \alias{dbc_init} 5 | \alias{dbc_init.default} 6 | \alias{dbc_init.src_sql} 7 | \title{Create functions for accessing a remote database} 8 | \usage{ 9 | dbc_init(con, con_id, env = parent.frame(), ...) 10 | 11 | \method{dbc_init}{default}( 12 | con, 13 | con_id, 14 | env = parent.frame(), 15 | tables = NULL, 16 | table_prefix = NULL, 17 | table_formatter = snakecase::to_snake_case, 18 | table_post = identity, 19 | ... 20 | ) 21 | 22 | \method{dbc_init}{src_sql}(con, con_id, env = parent.frame(), ...) 23 | } 24 | \arguments{ 25 | \item{con}{A DBI-compliant database connection object, or a 26 | \code{src_dbi}.} 27 | 28 | \item{con_id}{A short string that identifies the database. This 29 | is used to create functions \code{query_}, \code{tbl_} and 30 | \code{execute_} with appropriate names, as well as to cache the 31 | connection globally.} 32 | 33 | \item{env}{Environment in which to create the table accessors, such 34 | as the global environment or a package namespace.} 35 | 36 | \item{...}{Arguments passed on to \code{dbc_init.default}.} 37 | 38 | \item{tables}{Optionally, a vector of tables. 
Useful if dbcooper's 39 | table listing functions don't work for a database, or if you want to 40 | use only a subset of tables.} 41 | 42 | \item{table_prefix}{Optionally, a prefix to append to each table, 43 | usually a schema.} 44 | 45 | \item{table_formatter}{Optionally, a function to clean the table name 46 | before turning it into a function name, such as removing prefixes. 47 | By default, \code{\link[snakecase]{to_snake_case}}.} 48 | 49 | \item{table_post}{Optionally, post-processing to perform on each table before 50 | returning it.} 51 | } 52 | \description{ 53 | Create and assign functions that make accessing a database easy. 54 | These include \code{tbl_} functions for each table in the database, 55 | as well as \code{query_[id]} and \code{execute_[id]} functions for 56 | querying and executing SQL in the database. 57 | } 58 | \examples{ 59 | 60 | library(dplyr) 61 | library(dbplyr) 62 | 63 | # Initialize based on a SQL src or connection object 64 | src <- lahman_sqlite() 65 | dbc_init(src, "lahman") 66 | 67 | ## Tables 68 | 69 | # Access each table using autocompleted functions 70 | lahman_batting() 71 | 72 | # Can also pass the name of a table as a string to lahman_tbl 73 | lahman_tbl("Pitching") 74 | 75 | # Pass no argument to get a vector of all the tables 76 | lahman_list() 77 | 78 | # Run a SQL query 79 | lahman_query("SELECT COUNT(*) FROM Managers") 80 | 81 | # Execute queries that change the database 82 | lahman_execute("CREATE TABLE Players AS 83 | SELECT playerID, sum(AB) as AB FROM Batting GROUP BY playerID" 84 | ) 85 | 86 | lahman_tbl("Players") 87 | 88 | lahman_execute("DROP TABLE Players") 89 | 90 | } 91 | -------------------------------------------------------------------------------- /man/dbc_list_tables.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/utils.R 3 | \name{dbc_list_tables} 4 | \alias{dbc_list_tables} 5 | \alias{dbc_list_tables.default} 6 | \alias{dbc_list_tables.PqConnection} 7 | \alias{dbc_list_tables.Snowflake} 8 | \alias{dbc_list_tables.SnowflakeDBConnection} 9 | \alias{dbc_list_tables.duckdb_connection} 10 | \title{List tables in a database} 11 | \usage{ 12 | dbc_list_tables(con, exclude_schemas) 13 | 14 | \method{dbc_list_tables}{default}(con, exclude_schemas = c("information_schema", "pg_catalog")) 15 | 16 | \method{dbc_list_tables}{PqConnection}(con, exclude_schemas = c("information_schema", "pg_catalog")) 17 | 18 | \method{dbc_list_tables}{Snowflake}(con, exclude_schemas = c("INFORMATION_SCHEMA")) 19 | 20 | \method{dbc_list_tables}{SnowflakeDBConnection}(con, exclude_schemas = c("INFORMATION_SCHEMA")) 21 | 22 | \method{dbc_list_tables}{duckdb_connection}(con, exclude_schemas = c("INFORMATION_SCHEMA")) 23 | } 24 | \arguments{ 25 | \item{con}{A connection or pool object} 26 | 27 | \item{exclude_schemas}{Schemas for which no tables should be returned. 28 | The default excludes information_schema and pg_catalog, which typically 29 | contain database metadata.} 30 | } 31 | \description{ 32 | dbListTables doesn't work for all types of databases, especially 33 | ones that have schemas. We use this approach instead. 34 | Work in progress as we test across different database types. 
35 | } 36 | -------------------------------------------------------------------------------- /man/dbc_param.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/utils.R 3 | \name{dbc_param} 4 | \alias{dbc_param} 5 | \title{Require a parameter by name} 6 | \usage{ 7 | dbc_param(param_name) 8 | } 9 | \arguments{ 10 | \item{param_name}{Parameter to look up with \code{\link[base]{Sys.getenv}}.} 11 | } 12 | \description{ 13 | Throws an error if the parameter is not set. 14 | } 15 | -------------------------------------------------------------------------------- /man/dbc_query.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/table_functions.R 3 | \name{dbc_query} 4 | \alias{dbc_query} 5 | \title{Run a query on a SQL database and get a remote table back} 6 | \usage{ 7 | dbc_query(query, con_id) 8 | } 9 | \arguments{ 10 | \item{query}{Either a SQL query as a string, a file containing a SQL 11 | query, or a YAML file with a \code{sql} parameter.} 12 | 13 | \item{con_id}{A short string describing a globally cached connection pool} 14 | } 15 | \value{ 16 | A \code{tbl_sql} with the remote results 17 | } 18 | \description{ 19 | Run a query on a SQL database and get a remote table back 20 | } 21 | -------------------------------------------------------------------------------- /man/dbc_table.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/table_functions.R 3 | \name{dbc_table} 4 | \alias{dbc_table} 5 | \title{Access a single table} 6 | \usage{ 7 | dbc_table(table_name = NULL, con_id) 8 | } 9 | \arguments{ 10 | \item{table_name}{Name of the table in the database. If no table is 11 | provided, returns a list of tables as a character vector.} 12 | 13 | \item{con_id}{A short string describing a globally cached connection pool} 14 | } 15 | \description{ 16 | Access a single table 17 | } 18 | -------------------------------------------------------------------------------- /man/query_from_str.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/table_functions.R 3 | \name{query_from_str} 4 | \alias{query_from_str} 5 | \title{Given a string, turn it into a SQL query} 6 | \usage{ 7 | query_from_str(query) 8 | } 9 | \arguments{ 10 | \item{query}{A string or filename} 11 | } 12 | \description{ 13 | Internal function to pull a query from a string. If the string is in a 14 | YAML or plain text file, read it 15 | } 16 | -------------------------------------------------------------------------------- /man/set_option.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/utils.R 3 | \name{set_option} 4 | \alias{set_option} 5 | \title{Set an option by name} 6 | \usage{ 7 | set_option(option_name, value) 8 | } 9 | \arguments{ 10 | \item{option_name}{Name} 11 | 12 | \item{value}{Value} 13 | } 14 | \description{ 15 | Utility. This is a bit difficult to do with the options() function. 
16 | } 17 | -------------------------------------------------------------------------------- /tests/testthat.R: -------------------------------------------------------------------------------- 1 | library(testthat) 2 | library(dbcooper) 3 | 4 | test_check("dbcooper") 5 | -------------------------------------------------------------------------------- /tests/testthat/test-connections.R: -------------------------------------------------------------------------------- 1 | test_that("connection not found", { 2 | expect_error(dbc_get_connection("nonexistant"), "Connection not found") 3 | }) 4 | 5 | 6 | test_that("connection added", { 7 | conn <- DBI::dbConnect(RSQLite::dbDriver("SQLite"), ":memory:") 8 | dbc_add_connection(conn, "added_conn") 9 | expect_equal(dbc_get_connection("added_conn"), conn) 10 | }) 11 | 12 | 13 | test_that("connections cleared", { 14 | conn <- DBI::dbConnect(RSQLite::dbDriver("SQLite"), ":memory:") 15 | dbc_add_connection(conn, "dropped_conn") 16 | expect_equal(dbc_get_connection("dropped_conn"), conn) 17 | dbc_clear_connections() 18 | expect_error(dbc_get_connection("dropped_conn"), "Connection not found") 19 | }) 20 | -------------------------------------------------------------------------------- /tests/testthat/test-init.R: -------------------------------------------------------------------------------- 1 | test_that("init inits", { 2 | src <- dbplyr::lahman_sqlite() 3 | dbc_init(src, "lahman") 4 | 5 | expect_true("tbl_lazy" %in% class(lahman_appearances())) 6 | 7 | }) 8 | -------------------------------------------------------------------------------- /tests/testthat/test-utils.R: -------------------------------------------------------------------------------- 1 | test_that("param not set", { 2 | expect_error(dbc_param("nonexistant"), "You must set nonexistant in your .Renviron \\(then restart R\\)") 3 | }) 4 | 5 | 6 | test_that("option is set", { 7 | set_option("my_option_name", "my_option_value") 8 | expect_equal(getOption("my_option_name"), "my_option_value") 9 | }) 10 | --------------------------------------------------------------------------------
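/tests/testthat/test-table_functions.R: --------------------------------------------------------------------------------
 1 | # A hedged sketch, not part of the original package: these tests exercise the
 2 | # internal query_from_str() and dbc_query() helpers against an in-memory
 3 | # SQLite database. The file name, test names, and connection id are
 4 | # assumptions that mirror the style of the existing tests.
 5 | test_that("query_from_str wraps a literal query in parentheses", {
 6 |   expect_equal(query_from_str("SELECT 1"), "( SELECT 1 )")
 7 | })
 8 | 
 9 | test_that("dbc_query returns a lazy remote table", {
10 |   conn <- DBI::dbConnect(RSQLite::dbDriver("SQLite"), ":memory:")
11 |   DBI::dbExecute(conn, "CREATE TABLE t (x INTEGER)")
12 |   dbc_add_connection(conn, "query_conn")
13 |   expect_true("tbl_lazy" %in% class(dbc_query("SELECT x FROM t", "query_conn")))
14 | })
15 | 
--------------------------------------------------------------------------------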