├── pics
│   ├── logs.png
│   └── monitoring.png
├── demoDockerAppEngine
│   ├── cron.yaml
│   ├── app.yaml
│   ├── Dockerfile
│   ├── openapi.yaml
│   ├── api.R
│   └── README.md
├── endpointsR.Rproj
├── .gitignore
├── LICENSE
├── README.md
└── README.Rmd
/pics/logs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/MarkEdmondson1234/serverless-R-API-appengine/HEAD/pics/logs.png -------------------------------------------------------------------------------- /demoDockerAppEngine/cron.yaml: -------------------------------------------------------------------------------- 1 | cron: 2 | - description: "test cron" 3 | url: /demoR 4 | schedule: every 1 hours 5 | -------------------------------------------------------------------------------- /pics/monitoring.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/MarkEdmondson1234/serverless-R-API-appengine/HEAD/pics/monitoring.png -------------------------------------------------------------------------------- /endpointsR.Rproj: -------------------------------------------------------------------------------- 1 | Version: 1.0 2 | 3 | RestoreWorkspace: Default 4 | SaveWorkspace: Default 5 | AlwaysSaveHistory: Default 6 | 7 | EnableCodeIndexing: Yes 8 | UseSpacesForTab: Yes 9 | NumSpacesForTab: 2 10 | Encoding: UTF-8 11 | 12 | RnwWeave: Sweave 13 | LaTeX: pdfLaTeX 14 | -------------------------------------------------------------------------------- /demoDockerAppEngine/app.yaml: -------------------------------------------------------------------------------- 1 | runtime: custom 2 | env: flex 3 | automatic_scaling: 4 | min_num_instances: 1 5 | max_num_instances: 1 6 | 7 | resources: 8 | cpu: 1 9 | memory_gb: 2 10 | 11 | env_variables: 12 | GCS_AUTH_FILE: auth.json 13 | 14 | endpoints_api_service: 15 | name: mark-edmondson-usa.appspot.com 16 | config_id: 2017-08-03r1 
-------------------------------------------------------------------------------- /demoDockerAppEngine/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM gcr.io/gcer-public/plumber-appengine 2 | LABEL maintainer="mark" 3 | # RUN export DEBIAN_FRONTEND=noninteractive; apt-get -y update \ 4 | # && apt-get install -y 5 | 6 | # RUN ["install2.r", "-r 'https://cloud.r-project.org'", ""] 7 | # RUN ["installGithub.r", ""] 8 | 9 | WORKDIR /payload/ 10 | COPY [".", "./"] 11 | 12 | CMD ["api.R"] -------------------------------------------------------------------------------- /demoDockerAppEngine/openapi.yaml: -------------------------------------------------------------------------------- 1 | swagger: '2.0' 2 | info: 3 | description: API Description 4 | title: API Title 5 | version: 1.0.0 6 | host: mark-edmondson-usa.appspot.com 7 | schemes: [http] 8 | produces: [application/json] 9 | paths: 10 | /demoR: 11 | get: 12 | summary: '' 13 | responses: 14 | default: 15 | description: Default response. 
16 | parameters: [] 17 | operationId: demoR 18 | 19 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # History files 2 | .Rhistory 3 | .Rapp.history 4 | 5 | # Session Data files 6 | .RData 7 | 8 | # Example code in package build process 9 | *-Ex.R 10 | 11 | # Output files from R CMD build 12 | /*.tar.gz 13 | 14 | # Output files from R CMD check 15 | /*.Rcheck/ 16 | 17 | # RStudio files 18 | .Rproj.user/ 19 | 20 | # produced vignettes 21 | vignettes/*.html 22 | vignettes/*.pdf 23 | 24 | # OAuth2 token, see https://github.com/hadley/httr/releases/tag/v0.3 25 | .httr-oauth 26 | 27 | # knitr and R markdown default cache directories 28 | /*_cache/ 29 | /cache/ 30 | 31 | # Temporary files created by R markdown 32 | *.utf8.md 33 | *.knit.md 34 | .Rproj.user 35 | 36 | # auth files 37 | auth.json -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2017 Mark 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /demoDockerAppEngine/api.R: -------------------------------------------------------------------------------- 1 | library(googleAuthR) ## authentication 2 | library(googleCloudStorageR) ## google cloud storage 3 | library(readr) ## 4 | ## gcs auto authenticated via environment file 5 | ## pointed to via sys.env GCS_AUTH_FILE 6 | 7 | #* @get /demoR 8 | demoScheduleAPI <- function(){ 9 | 10 | ## download or do something 11 | something <- tryCatch({ 12 | gcs_get_object("schedule/test.csv", 13 | bucket = "mark-edmondson-public-files") 14 | }, error = function(ex) { 15 | NULL 16 | }) 17 | 18 | something_else <- data.frame(X1 = 1, 19 | time = Sys.time(), 20 | blah = paste(sample(letters, 10, replace = TRUE), collapse = "")) 21 | something <- rbind(something, something_else) 22 | 23 | tmp <- tempfile(fileext = ".csv") 24 | on.exit(unlink(tmp)) 25 | write.csv(something, file = tmp, row.names = FALSE) 26 | ## upload something 27 | gcs_upload(tmp, 28 | bucket = "mark-edmondson-public-files", 29 | name = "schedule/test.csv") 30 | 31 | cat("Done", Sys.time()) 32 | } 33 | 34 | ## run locally via 35 | # pr <- plumber::plumb("schedule.R"); pr$run(port=8080) -------------------------------------------------------------------------------- /demoDockerAppEngine/README.md: -------------------------------------------------------------------------------- 1 | # README 2 | 3 | 0. Create a Google App Engine project in the US region (the only region that supports flexible containers at the moment) 4 | 1. Create a scheduled script e.g. `schedule.R` - you can use auth from environment files specified in `app.yaml`. 5 | 2. 
Make an API out of the script by using `plumber` - example: 6 | 7 | ```r 8 | library(googleAuthR) ## authentication 9 | library(googleCloudStorageR) ## google cloud storage 10 | library(readr) ## 11 | ## gcs auto authenticated via environment file 12 | ## pointed to via sys.env GCS_AUTH_FILE 13 | 14 | #* @get /demoR 15 | demoScheduleAPI <- function(){ 16 | 17 | ## download or do something 18 | something <- tryCatch({ 19 | gcs_get_object("schedule/test.csv", 20 | bucket = "mark-edmondson-public-files") 21 | }, error = function(ex) { 22 | NULL 23 | }) 24 | 25 | something_else <- data.frame(X1 = 1, 26 | time = Sys.time(), 27 | blah = paste(sample(letters, 10, replace = TRUE), collapse = "")) 28 | something <- rbind(something, something_else) 29 | 30 | tmp <- tempfile(fileext = ".csv") 31 | on.exit(unlink(tmp)) 32 | write.csv(something, file = tmp, row.names = FALSE) 33 | ## upload something 34 | gcs_upload(tmp, 35 | bucket = "mark-edmondson-public-files", 36 | name = "schedule/test.csv") 37 | 38 | cat("Done", Sys.time()) 39 | } 40 | 41 | ``` 42 | 43 | 3. Create Dockerfile. 
If using `containerit` then replace FROM with `trestletech/plumber` and add the lines below to use the correct App Engine port: 44 | 45 | Example: 46 | 47 | ```r 48 | library(containerit) 49 | 50 | dockerfile <- dockerfile("schedule.R", copy = "script_dir", soft = TRUE) 51 | write(dockerfile, file = "Dockerfile") 52 | ``` 53 | 54 | Then change/add these lines: 55 | 56 | ``` 57 | EXPOSE 8080 58 | ENTRYPOINT ["R", "-e", "pr <- plumber::plumb(commandArgs()[4]); pr$run(host='0.0.0.0', port=8080)"] 59 | CMD ["schedule.R"] 60 | ``` 61 | 62 | Example output: 63 | 64 | ``` 65 | FROM trestletech/plumber 66 | LABEL maintainer="mark" 67 | RUN export DEBIAN_FRONTEND=noninteractive; apt-get -y update \ 68 | && apt-get install -y libcairo2-dev \ 69 | libcurl4-openssl-dev \ 70 | libgmp-dev \ 71 | libpng-dev \ 72 | libssl-dev \ 73 | libxml2-dev \ 74 | make \ 75 | pandoc \ 76 | pandoc-citeproc \ 77 | zlib1g-dev 78 | RUN ["install2.r", "-r 'https://cloud.r-project.org'", "readr", "googleCloudStorageR", "Rcpp", "digest", "crayon", "withr", "mime", "R6", "jsonlite", "xtable", "magrittr", "httr", "curl", "testthat", "devtools", "hms", "shiny", "httpuv", "memoise", "htmltools", "openssl", "tibble", "remotes"] 79 | RUN ["installGithub.r", "MarkEdmondson1234/googleAuthR@7917351", "hadley/rlang@ff87439"] 80 | WORKDIR /payload/ 81 | COPY [".", "./"] 82 | 83 | EXPOSE 8080 84 | ENTRYPOINT ["R", "-e", "pr <- plumber::plumb(commandArgs()[4]); pr$run(host='0.0.0.0', port=8080)"] 85 | CMD ["schedule.R"] 86 | ``` 87 | 88 | 89 | 3. Specify `app.yaml` with any environment variables, such as auth files, that will be included in the same folder. Also limit instances if needed to make sure errors don't spawn hundreds of VMs. 90 | 91 | Example: 92 | 93 | ```yaml 94 | runtime: custom 95 | env: flex 96 | 97 | env_variables: 98 | GCS_AUTH_FILE: auth.json 99 | ``` 100 | 101 | 4. 
Specify `cron.yaml` for the schedule needed: 102 | 103 | ```yaml 104 | cron: 105 | - description: "test cron" 106 | url: /demoR 107 | schedule: every 1 hours 108 | ``` 109 | 110 | 5. Deploy via `gcloud app deploy --project your-project` 111 | 112 | Deploy new cron via `gcloud app deploy cron.yaml --project your-project` 113 | 114 | 6. App should then be deployed on https://your-project.appspot.com/ - every GET request to https://your-project.appspot.com/demoR (or other endpoints you have specified in the R script) will run the R code. The cron will call this endpoint every hour, as specified in the cron file above. Logs for the instance are found [here](https://console.cloud.google.com/logs/viewer). 115 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | Creating a serverless R API on Google Cloud Platform 2 | ================ 3 | Mark Edmondson 4 | 8/3/2017 5 | 6 | R + Docker + containerit + Google Build triggers + GitHub + Plumber + Swagger (OpenAPI) + Flexible Custom runtime App Engine + Google Cloud Endpoints = a serverless, scalable R API that can be called by non-R SDKs, with built-in OAuth2, auth keys, and monitoring 7 | 8 | Deploy to App Engine 9 | -------------------- 10 | 11 | 1. Install [plumber](https://www.rplumber.io/), [containerit](https://github.com/o2r-project/containerit), and the Python [gcloud SDK](https://cloud.google.com/sdk/downloads) 12 | 13 | Below is taken from the guide [here](demoDockerAppEngine/README.md), which has more detail. 14 | 15 | 1. Create a plumber R script with endpoints for your R code 16 | 2. Use `library(containerit)` to create a Dockerfile of the dependencies your R code needs 17 | 3. Alter the generated Dockerfile so it works on App Engine as [detailed there](demoDockerAppEngine/README.md) 18 | 4. 
You can use `FROM gcr.io/gcer-public/plumber-appengine` to speed up build times as it has plumber preinstalled. It also has [`googleCloudStorageR`](https://github.com/cloudyr/googleCloudStorageR) so you can pull data into your scripts, but in that case you will also need to supply an auth.json 19 | 20 | `gcr.io/gcer-public/plumber-appengine` is a [publicly built Docker image](https://cloudyr.github.io/googleComputeEngineR/articles/docker.html#public-docker-images) from the `googleComputeEngineR` project, but you can make your own private one if you like and use [Build Triggers](https://cloud.google.com/container-builder/docs/how-to/build-triggers) to create it. 21 | 22 | In any case, the Dockerfile is much simpler: 23 | 24 | FROM gcr.io/gcer-public/plumber-appengine 25 | LABEL maintainer="mark" 26 | 27 | ## uncomment as needed 28 | # RUN export DEBIAN_FRONTEND=noninteractive; apt-get -y update \ 29 | # && apt-get install -y 30 | 31 | ## uncomment as needed 32 | # RUN ["install2.r", "-r 'https://cloud.r-project.org'", ""] 33 | # RUN ["installGithub.r", ""] 34 | 35 | WORKDIR /payload/ 36 | COPY [".", "./"] 37 | 38 | CMD ["api.R"] 39 | 40 | 1. Configure App Engine's `app.yaml` 41 | 42 | Example: 43 | 44 | ``` yaml 45 | runtime: custom 46 | env: flex 47 | 48 | env_variables: 49 | GCS_AUTH_FILE: auth.json 50 | ``` 51 | 52 | If using `googleCloudStorageR` in your script, you will need to include your own `auth.json` service JSON key from your Google Cloud project in the folder you upload. 53 | 54 | 1. Upload to a USA-based App Engine via `gcloud app deploy --project your-project` 55 | 2. Your R code is now deployed as a Plumber app 56 | 57 | The App Engine is now up and running with your plumber-powered R API. 
It will autoscale as more connections are added, as configured in the `app.yaml` - [reference](https://cloud.google.com/appengine/docs/standard/python/config/appref) 58 | 59 | Deploying Cloud Endpoints 60 | ------------------------- 61 | 62 | We now enable Google Cloud Endpoints [following this guide](https://cloud.google.com/endpoints/docs/get-started-app-engine#deploy_configuration). 63 | 64 | 1. Plumber generates a swagger.json file that is available at `https://your-project.appspot.com/swagger.json` or, if you run it locally, at `http://127.0.0.1:8080/swagger.json` 65 | 2. Run the function below to generate the openapi.yaml for Google Cloud Endpoints 66 | 67 | ``` r 68 | library(yaml) 69 | library(jsonlite) 70 | 71 | make_openapi <- function(projectId){ 72 | json <- jsonlite::fromJSON(sprintf("https://%s.appspot.com/swagger.json", projectId)) 73 | json$host <- sprintf("%s.appspot.com", projectId) 74 | 75 | ## add operationId to each endpoint 76 | ohgod <- lapply(names(json$paths), 77 | function(x) {lapply(json$paths[[x]], 78 | function(verb) {verb$operationId <- basename(x);verb})}) 79 | json$paths <- setNames(ohgod, names(json$paths)) 80 | 81 | # silly formatting 82 | yaml <- gsub("application/json", "[application/json]", yaml::as.yaml(json)) 83 | yaml <- gsub("schemes: http", "schemes: [http]", yaml) 84 | 85 | writeLines(yaml, con = "openapi.yaml") 86 | } 87 | 88 | make_openapi("your-project-id") 89 | ``` 90 | 91 | ([issue raised with plumber](https://github.com/trestletech/plumber/issues/154) to see if this can be handled by the library) 92 | 93 | 1. Deploy the openapi.yaml in the terminal via `gcloud service-management deploy openapi.yaml --project your-project` 94 | 2. Run `gcloud service-management configs list --service=your-project.appspot.com` to see the service management name and config you just uploaded 95 | 3. 
Add these lines to the App Engine's `app.yaml` to include the info you got from the listing: 96 | 97 | 98 | 99 | endpoints_api_service: 100 | # The following values are to be replaced by information from the output of 101 | # 'gcloud service-management deploy openapi.yaml' command. 102 | name: ENDPOINTS-SERVICE-NAME 103 | config_id: ENDPOINTS-CONFIG-ID 104 | 105 | Save, then deploy the app again via `gcloud app deploy --project your-project` 106 | 107 | Once deployed, the `/swagger.json` won't be available as it's not in the API spec. 108 | 109 | ### Check it 110 | 111 | You should now see monitoring and logs: 112 | 113 | ![](pics/logs.png) 114 | 115 | ![](pics/monitoring.png) 116 | 117 | Going further 118 | ------------- 119 | 120 | You can now play around with Cloud Endpoints features by modifying the configuration files. 121 | 122 | - [Alter automatic scaling](https://cloud.google.com/appengine/docs/flexible/nodejs/configuring-your-app-with-app-yaml#automatic_scaling) to determine when and how to launch new instances to cover traffic. 123 | - [Restrict access](https://cloud.google.com/endpoints/docs/api-access-overview) via roles, API keys, OAuth2 etc. 124 | - [Publicise the API](https://cloud.google.com/endpoints/docs/control-api-callers) so other users can use it in their own projects 125 | - [Generate client library bundles in Java or Python](https://cloud.google.com/endpoints/docs/frameworks/python/gen_clients) 126 | - [Integrate R endpoints with other endpoints written in different languages](https://cloudplatform.googleblog.com/2016/06/creating-a-scalable-API-with-microservices.html) 127 | 128 | Pricing 129 | ------- 130 | 131 | - Cloud Endpoints: 0-2 million calls free, then $3 per million API calls 132 | - App Engine costs - [Flexible App Engine pricing](https://cloud.google.com/appengine/pricing#flexible-environment-instances) - depends largely on how many instances you spawn per X connections. 
133 | 134 | If you have peaks of traffic that spawn more instances than idle periods, it's the total number of instance hours that count (e.g. a peak of one hour that launches 24 instances will cost the same as 24 hours with constant traffic that needs only one instance). You determine how large these instances are and when they spawn (normally when they hit 50% of CPU), so it could be cheaper to have one large instance rather than two small ones. 135 | 136 | The automatic scaling and resources of each instance will be the largest determination of cost. Use the monitoring to get the latency of the API and configure the `app.yaml` accordingly to get the performance you require, which will determine when extra instances are launched running your R code underneath. For example, if you ran the default automatic scaling with the [default resources](https://cloud.google.com/appengine/docs/flexible/nodejs/configuring-your-app-with-app-yaml#resource-settings) (2 instances with 1 CPU core, 0.6GB) and you have enough API traffic for 24 hours of constant API usage, it will cost [$2.73 per day](https://cloud.google.com/products/calculator/#id=44336162-27f9-41b7-94a4-73036357e6d6). 137 | 138 | Make sure to put billing alerts and maximum spend on your App Engine to avoid big charges; I typically put a $1 a day limit on App Engine when testing just to make sure nothing huge can go wrong with a misconfiguration somewhere. 
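The instance-hours arithmetic above can be sketched in R (the hourly rate here is a made-up placeholder, not a quoted Google price):

``` r
# Illustrative only: flexible-environment billing is driven by total
# instance-hours, so a short peak and a steady trickle can cost the same
hourly_rate <- 0.05  # assumed $ per instance-hour (placeholder)

peak_cost   <- 24 * 1 * hourly_rate  # 24 instances running for 1 hour
steady_cost <- 1 * 24 * hourly_rate  # 1 instance running for 24 hours

peak_cost == steady_cost  # TRUE - only the instance-hour total matters
```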
139 | -------------------------------------------------------------------------------- /README.Rmd: -------------------------------------------------------------------------------- 1 | --- 2 | title: "Creating a serverless R API on Google Cloud Platform" 3 | author: "Mark Edmondson" 4 | date: "8/3/2017" 5 | output: github_document 6 | --- 7 | 8 | ```{r setup, include=FALSE} 9 | knitr::opts_chunk$set(echo = TRUE) 10 | ``` 11 | 12 | R + Docker + containerit + Google Build triggers + GitHub + Plumber + Swagger (OpenAPI) + Flexible Custom runtime App Engine + Google Cloud Endpoints = a serverless, scalable R API that can be called by non-R SDKs, with built-in OAuth2, auth keys, and monitoring 13 | 14 | ## Deploy to App Engine 15 | 16 | 0. Install [plumber](https://www.rplumber.io/), [containerit](https://github.com/o2r-project/containerit), and the Python [gcloud SDK](https://cloud.google.com/sdk/downloads) 17 | 18 | Below is taken from the guide [here](demoDockerAppEngine/README.md), which has more detail. 19 | 20 | 1. Create a plumber R script with endpoints for your R code 21 | 2. Use `library(containerit)` to create a Dockerfile of the dependencies your R code needs 22 | 3. Alter the generated Dockerfile so it works on App Engine as [detailed there](demoDockerAppEngine/README.md) 23 | 4. You can use `FROM gcr.io/gcer-public/plumber-appengine` to speed up build times as it has plumber preinstalled. It also has [`googleCloudStorageR`](https://github.com/cloudyr/googleCloudStorageR) so you can pull data into your scripts, but in that case you will also need to supply an auth.json 24 | 25 | `gcr.io/gcer-public/plumber-appengine` is a [publicly built Docker image](https://cloudyr.github.io/googleComputeEngineR/articles/docker.html#public-docker-images) from the `googleComputeEngineR` project, but you can make your own private one if you like and use [Build Triggers](https://cloud.google.com/container-builder/docs/how-to/build-triggers) to create it. 
26 | 27 | In any case, the Dockerfile is much simpler: 28 | 29 | ``` 30 | FROM gcr.io/gcer-public/plumber-appengine 31 | LABEL maintainer="mark" 32 | 33 | ## uncomment as needed 34 | # RUN export DEBIAN_FRONTEND=noninteractive; apt-get -y update \ 35 | # && apt-get install -y 36 | 37 | ## uncomment as needed 38 | # RUN ["install2.r", "-r 'https://cloud.r-project.org'", ""] 39 | # RUN ["installGithub.r", ""] 40 | 41 | WORKDIR /payload/ 42 | COPY [".", "./"] 43 | 44 | CMD ["api.R"] 45 | ``` 46 | 47 | 4. Configure App Engine's `app.yaml` 48 | 49 | Example: 50 | 51 | ```yaml 52 | runtime: custom 53 | env: flex 54 | 55 | env_variables: 56 | GCS_AUTH_FILE: auth.json 57 | ``` 58 | 59 | If using `googleCloudStorageR` in your script, you will need to include your own `auth.json` service JSON key from your Google Cloud project in the folder you upload. 60 | 61 | 5. Upload to a USA-based App Engine via `gcloud app deploy --project your-project` 62 | 6. Your R code is now deployed as a Plumber app at https://your-project.appspot.com/ 63 | 64 | The App Engine is now up and running with your plumber-powered R API. It will autoscale as more connections are added, as configured in the `app.yaml` - [reference](https://cloud.google.com/appengine/docs/standard/python/config/appref) 65 | 66 | ## Deploying Cloud Endpoints 67 | 68 | We now enable Google Cloud Endpoints [following this guide](https://cloud.google.com/endpoints/docs/get-started-app-engine#deploy_configuration). 69 | 70 | 6. Plumber generates a swagger.json file that is available at `https://your-project.appspot.com/swagger.json` or, if you run it locally, at `http://127.0.0.1:8080/swagger.json` 71 | 7. 
Run the function below to generate the openapi.yaml for Google Cloud Endpoints 72 | 73 | ```r 74 | library(yaml) 75 | library(jsonlite) 76 | 77 | make_openapi <- function(projectId){ 78 | json <- jsonlite::fromJSON(sprintf("https://%s.appspot.com/swagger.json", projectId)) 79 | json$host <- sprintf("%s.appspot.com", projectId) 80 | 81 | ## add operationId to each endpoint 82 | ohgod <- lapply(names(json$paths), 83 | function(x) {lapply(json$paths[[x]], 84 | function(verb) {verb$operationId <- basename(x);verb})}) 85 | json$paths <- setNames(ohgod, names(json$paths)) 86 | 87 | # silly formatting 88 | yaml <- gsub("application/json", "[application/json]", yaml::as.yaml(json)) 89 | yaml <- gsub("schemes: http", "schemes: [http]", yaml) 90 | 91 | writeLines(yaml, con = "openapi.yaml") 92 | } 93 | 94 | make_openapi("your-project-id") 95 | 96 | ``` 97 | ([issue raised with plumber](https://github.com/trestletech/plumber/issues/154) to see if this can be handled by the library) 98 | 99 | 8. Deploy the openapi.yaml in the terminal via `gcloud service-management deploy openapi.yaml --project your-project` 100 | 9. Run `gcloud service-management configs list --service=your-project.appspot.com` to see the service management name and config you just uploaded 101 | 10. Add these lines to the App Engine's `app.yaml` to include the info you got from the listing: 102 | 103 | ``` 104 | endpoints_api_service: 105 | # The following values are to be replaced by information from the output of 106 | # 'gcloud service-management deploy openapi.yaml' command. 107 | name: ENDPOINTS-SERVICE-NAME 108 | config_id: ENDPOINTS-CONFIG-ID 109 | ``` 110 | 111 | Save, then deploy the app again via `gcloud app deploy --project your-project` 112 | 113 | Once deployed, the `/swagger.json` won't be available as it's not in the API spec. 
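Once redeployed, you can sanity-check an endpoint from R; a minimal sketch using `httr` (the project ID is a placeholder, and a `key` query parameter is only needed if you have enabled API-key restriction in Cloud Endpoints):

```r
library(httr)

project_id <- "your-project"  # placeholder - use your own project ID

# GET the /demoR endpoint defined in api.R; add query = list(key = "...")
# if Cloud Endpoints is configured to require an API key
resp <- GET(sprintf("https://%s.appspot.com/demoR", project_id))
stop_for_status(resp)  # error out on a non-2xx response
content(resp)          # parsed response body
```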
114 | 115 | ### Check it 116 | 117 | You should now see monitoring and logs: 118 | 119 | ![](pics/logs.png) 120 | 121 | ![](pics/monitoring.png) 122 | 123 | 124 | ## Going further 125 | 126 | You can now play around with Cloud Endpoints features by modifying the configuration files. 127 | 128 | - [Alter automatic scaling](https://cloud.google.com/appengine/docs/flexible/nodejs/configuring-your-app-with-app-yaml#automatic_scaling) to determine when and how to launch new instances to cover traffic. 129 | - [Restrict access](https://cloud.google.com/endpoints/docs/api-access-overview) via roles, API keys, OAuth2 etc. 130 | - [Publicise the API](https://cloud.google.com/endpoints/docs/control-api-callers) so other users can use it in their own projects 131 | - [Generate client library bundles in Java or Python](https://cloud.google.com/endpoints/docs/frameworks/python/gen_clients) 132 | - [Integrate R endpoints with other endpoints written in different languages](https://cloudplatform.googleblog.com/2016/06/creating-a-scalable-API-with-microservices.html) 133 | 134 | ## Pricing 135 | 136 | * Cloud Endpoints: 0-2 million calls free, then $3 per million API calls 137 | * App Engine costs - [Flexible App Engine pricing](https://cloud.google.com/appengine/pricing#flexible-environment-instances) - depends largely on how many instances you spawn per X connections. 138 | 139 | If you have peaks of traffic that spawn more instances than idle periods, it's the total number of instance hours that count (e.g. a peak of one hour that launches 24 instances will cost the same as 24 hours with constant traffic that needs only one instance). You determine how large these instances are and when they spawn (normally when they hit 50% of CPU), so it could be cheaper to have one large instance rather than two small ones. 140 | 141 | The automatic scaling and resources of each instance will be the largest determination of cost. 
Use the monitoring to get the latency of the API and configure the `app.yaml` accordingly to get the performance you require, which will determine when extra instances are launched running your R code underneath. For example, if you ran the default automatic scaling with the [default resources](https://cloud.google.com/appengine/docs/flexible/nodejs/configuring-your-app-with-app-yaml#resource-settings) (2 instances with 1 CPU core, 0.6GB) and you have enough API traffic for 24 hours of constant API usage, it will cost [$2.73 per day](https://cloud.google.com/products/calculator/#id=44336162-27f9-41b7-94a4-73036357e6d6). 142 | 143 | Make sure to put billing alerts and maximum spend on your App Engine to avoid big charges; I typically put a $1 a day limit on App Engine when testing just to make sure nothing huge can go wrong with a misconfiguration somewhere. 144 | 145 | 146 | --------------------------------------------------------------------------------