` to `bytes::Bytes`. This should not break existing applications, but dynomite users now get transparent support for Items that declare fields of type `bytes::Bytes`, which will be interpreted as the same opaque binary blob of bytes, for free.
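
For illustration, a minimal sketch of such an Item (the `Upload` struct and its fields are hypothetical):

```rust
use bytes::Bytes;
use dynomite::Item;
use uuid::Uuid;

// hypothetical item: `contents` is treated as an opaque binary (B) attribute,
// just like a `Vec<u8>` field would be
#[derive(Item)]
struct Upload {
    #[dynomite(partition_key)]
    id: Uuid,
    contents: Bytes,
}
```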
82 |
83 | # 0.4.1
84 |
85 | * added a new `rustls` feature flag which, when enabled, replaces OpenSSL with `rustls` [#54](https://github.com/softprops/dynomite/pull/55)
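
To opt in, disable default features and enable `rustls` in your Cargo.toml (a sketch; adjust the version to the release you depend on):

```toml
[dependencies.dynomite]
version = "0.4"
default-features = false
features = ["rustls"]
```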
86 |
87 | # 0.4.0
88 |
89 | * Upgrade to latest rusoto version [`0.38.0`](https://github.com/rusoto/rusoto/blob/master/CHANGELOG.md#0380---2019-04-17)
90 |
91 | # 0.3.0
92 |
93 | * Upgrade to latest rusoto version ([`0.37.0`](https://github.com/rusoto/rusoto/blob/master/CHANGELOG.md#0370---2019-03-12)) with added support for new DynamoDB methods `describe_endpoints`, `transact_get_items`, and `transact_write_items`.
94 | * Upgrading to the latest rusoto means that clients are Cloneable. As such, `Arc` restrictions are removed on stream-based auto-pagination interfaces.
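
For example, an auto-paginating stream can now take ownership of a plain clone of the client instead of an `Arc`-wrapped one (a sketch mirroring the demo example in this repository):

```rust
use dynomite::{
    dynamodb::{DynamoDbClient, ScanInput},
    DynamoDbExt,
};

let client = DynamoDbClient::new(Default::default());
// no Arc wrapper needed: the stream owns its own clone of the client
let pages = client.clone().scan_pages(ScanInput {
    table_name: "books".to_string(),
    ..ScanInput::default()
});
```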
95 |
96 | # 0.2.1
97 |
98 | * Add support for configuring policies for retrying requests [based on DynamoDB recommendations](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Programming.Errors.html)
99 |
100 |
101 | ```rust
102 | use dynomite::{Retries, retry::Policy};
103 | use dynomite::dynamodb::{DynamoDb, DynamoDbClient};
104 |
105 | fn main() {
106 | let client =
107 | DynamoDbClient::new(Default::default())
108 | .with_retries(Policy::default());
109 |
110 | // any client operation will now be retried when
111 | // appropriate
112 | let tables = client.list_tables(Default::default());
113 | // other important work...
114 | }
115 | ```
116 |
117 | * update documentation to highlight more concisely areas of focus
118 |
119 | # 0.2.0
120 |
121 | * upgraded to 2018 edition
122 | * a side effect of this is that an interaction with 2018-style imports caused a name conflict between `dynomite::Item` and the new `dynomite_derive::Item`. As a result the dynomite crate now has a
123 | compiler feature flag called "derive", enabled by default, which resolves this. If you do not wish to have the feature enabled by default, add the following to your Cargo.toml
124 |
125 | ```toml
126 | [dependencies.dynomite]
127 | version = "0.2"
128 | default-features = false
129 | features = ["uuid"]
130 | ```
131 | * updates to supported Attribute type conversions
132 |
133 | * numeric sets (NS) no longer support vec type conversions, only set types!
134 | * list types (L) now support any type that implements `Attribute`; previously this only
135 | supported lists of types that implemented `Item` (a complex type). This means lists of scalars are now supported by default (see the sketch after this list)
136 | * `Cow` is now supported for String Attributes
137 | * `FromAttributes` is now implemented for `XXXMap` types of `String` to `Attribute` types.
138 | This means you now get free, Item-like integration for homogeneous maps
139 | * much needed unit tests now cover the correctness of implementations!
140 | * (breaking change) the `DynamoDbExt.stream_xxx` methods which produced auto-paginating streams have been renamed to `DynamoDbExt.xxx_pages` to be more intention-revealing and in line with the naming conventions of other language SDKs' methods that implement similar functionality.
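
A rough sketch of the new conversions (illustrative only; it assumes the scalar and `Cow` impls described above):

```rust
use dynomite::Attribute;
use std::borrow::Cow;

// lists of scalars now round-trip through a single list (L) attribute value
let list = vec![1u32, 2, 3].into_attr();
assert_eq!(Ok(vec![1u32, 2, 3]), Vec::<u32>::from_attr(list));

// Cow<'_, str> can be used where a String attribute is expected
let title: Cow<'_, str> = Cow::Borrowed("dynomite");
let title_attr = title.into_attr();
```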
141 |
142 | # 0.1.5
143 |
144 | * updated dependencies
145 |
146 | * `Rusoto-*` 0.34 -> 0.36
147 |
148 | # 0.1.4
149 |
150 | * add Stream oriented extension interfaces for paginated apis
151 |
152 | By default, the `DynamoDb` APIs `list_backups`, `list_tables`, `query`, and `scan`
153 | all require applications to manage pagination themselves, using inconsistent APIs.
154 | This release adds extension methods prefixed with `stream_`, which provide
155 | a consistent interface for retrieving a `futures::Stream` of their
156 | respective values.
157 |
158 | * add a `maplit!`-inspired `attr_map!` helper macro, useful in query contexts when providing `expression_attribute_values`
159 |
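For example, a sketch of providing substitution values for a query or scan:

```rust
use dynomite::attr_map;

// expands to a map of expression attribute values keyed by placeholder name
let values = attr_map!(
    ":title" => "rust".to_string()
);
```
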
160 | * pin rusoto crate versioning to minor release `0.34`
161 |
162 | In the past this crate was pinned to a major version of rusoto. It will be pinned to a minor
163 | version going forward.
164 |
165 | See the [demo application](https://github.com/softprops/dynomite/blob/5ed3444a46a02bd560644fed35adb553ffb8a0f0/dynomite-derive/examples/demo.rs) for examples of updated interfaces.
166 |
167 | # 0.1.3
168 |
169 | * fix examples for rusoto breaking changes in `0.32`, the async release
170 |
171 | # 0.1.2
172 |
173 | * fix `dynomite-derive` `dynomite` dependency version
174 |
175 | # 0.1.1
176 |
177 | * initial release
178 |
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing
2 |
3 | ## Filing an Issue
4 |
5 | If you are trying to use `dynomite` and run into an issue, please file an
6 | issue! We'd love to get you up and running, even if the issue you have might
7 | not be directly related to the code in `dynomite`. This library seeks to make
8 | it easy for developers to get going, so there's a good chance we can do
9 | something to alleviate the issue by making `dynomite` better documented or
10 | more robust to different developer environments.
11 |
12 | When filing an issue, do your best to be as specific as possible.
13 | The faster we can reproduce your issue, the faster we
14 | can fix it for you!
15 |
16 | ## Submitting a PR
17 |
18 | If you are considering filing a pull request, make sure that there's an issue
19 | filed for the work you'd like to do. There might be some discussion required!
20 | Filing an issue first will help ensure that the work you put into your pull
21 | request will get merged :)
22 |
23 | Before you submit your pull request, check that you have completed all of the
24 | steps mentioned in the pull request template. Link the issue that your pull
25 | request is responding to, and format your code using [rustfmt][rustfmt].
26 |
27 | ### Configuring rustfmt
28 |
29 | Before submitting code in a PR, make sure that you have formatted the codebase
30 | using [rustfmt][rustfmt]. `rustfmt` is a tool for formatting Rust code, which
31 | helps keep style consistent across the project. If you have not used `rustfmt`
32 | before, it is not too difficult.
33 |
34 | If you have not already configured `rustfmt` for the
35 | nightly toolchain, it can be done using the following steps:
36 |
37 | **1. Use Nightly Toolchain**
38 |
39 | Install the nightly toolchain. This will only be necessary as long as rustfmt produces different results on stable and nightly.
40 |
41 | ```sh
42 | $ rustup toolchain install nightly
43 | ```
44 |
45 | **2. Add the rustfmt component**
46 |
47 | Install the most recent version of `rustfmt` using this command:
48 |
49 | ```sh
50 | $ rustup component add rustfmt --toolchain nightly
51 | ```
52 |
53 | **3. Running rustfmt**
54 |
55 | To run `rustfmt`, use this command:
56 |
57 | ```sh
58 | cargo +nightly fmt --all
59 | ```
60 |
61 | [rustfmt]: https://github.com/rust-lang-nursery/rustfmt
62 |
63 | ### IDE Configuration files
64 | Machine specific configuration files may be generated by your IDE while working on the project. Please make sure to add these files to a global .gitignore so they are kept from accidentally being committed to the project and causing issues for other contributors.
65 |
66 | Some examples of these files are the `.idea` folder created by JetBrains products (WebStorm, IntelliJ, etc) as well as `.vscode` created by Visual Studio Code for workspace specific settings.
67 |
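For example, one common way to set this up (the `~/.gitignore_global` path is just a convention; use whatever location you prefer):

```sh
$ git config --global core.excludesfile ~/.gitignore_global
$ echo ".idea/" >> ~/.gitignore_global
$ echo ".vscode/" >> ~/.gitignore_global
```
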
68 | For help setting up a global .gitignore check out this [GitHub article]!
69 |
70 | [GitHub article]: https://help.github.com/articles/ignoring-files/#create-a-global-gitignore
71 |
72 | ## Conduct
73 |
74 | This project follows the [Rust Code of Conduct](https://www.rust-lang.org/en-US/conduct.html)
--------------------------------------------------------------------------------
/Cargo.toml:
--------------------------------------------------------------------------------
1 | [workspace]
2 | members = ["dynomite", "dynomite-derive"]
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Copyright (c) 2018-2020 Doug Tangren
2 |
3 | Permission is hereby granted, free of charge, to any person obtaining
4 | a copy of this software and associated documentation files (the
5 | "Software"), to deal in the Software without restriction, including
6 | without limitation the rights to use, copy, modify, merge, publish,
7 | distribute, sublicense, and/or sell copies of the Software, and to
8 | permit persons to whom the Software is furnished to do so, subject to
9 | the following conditions:
10 |
11 | The above copyright notice and this permission notice shall be
12 | included in all copies or substantial portions of the Software.
13 |
14 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
15 | EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
16 | MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
17 | NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
18 | LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
19 | OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
20 | WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
21 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 | 🦀🧨
3 |
4 |
5 |
6 | dynomite
7 |
8 |
9 |
10 | dynomite makes DynamoDB fit your types (and vice versa)
11 |
12 |
13 |
30 |
31 |
32 |
33 | ## Overview
34 |
35 | Goals
36 |
37 | * ⚡ make writing [dynamodb](https://aws.amazon.com/dynamodb/) applications in [rust](https://www.rust-lang.org/) a productive experience
38 | * 🦀 exploit rust's type safety features
39 | * 👩💻 leverage existing work of the [rusoto](https://github.com/rusoto/rusoto) rust project
40 | * ☔ commitment to supporting applications built using stable rust
41 | * 📚 commitment to documentation
42 |
43 | Features
44 |
45 | * 💌 less boilerplate
46 | * ♻️ automatic async pagination
47 | * 🕶️ client level retry interfaces for [robust error handling](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Programming.Errors.html)
48 |
49 |
50 | From this
51 |
52 | ```rust
53 | use std::collections::HashMap;
54 | use rusoto_dynamodb::AttributeValue;
55 | use uuid::Uuid;
56 |
57 | let mut item = HashMap::new();
58 | item.insert(
59 | "pk".to_string(), AttributeValue {
60 | s: Some(Uuid::new_v4().to_hyphenated().to_string()),
61 | ..AttributeValue::default()
62 | }
63 | );
64 | item.insert(
65 | // 🤬typos anyone?
66 | "quanity".to_string(), AttributeValue {
67 | n: Some("whoops".to_string()),
68 | ..AttributeValue::default()
69 | }
70 | );
71 | ```
72 |
73 | To this
74 |
75 | ```rust
76 | use dynomite::Item;
77 | use uuid::Uuid;
78 |
79 | #[derive(Item)]
80 | struct Order {
81 | #[dynomite(partition_key)]
82 | pk: Uuid,
83 | quantity: u16
84 | }
85 |
86 | let item = Order {
87 | pk: Uuid::new_v4(),
88 | quantity: 4
89 | }.into();
90 | ```
91 |
92 | Please see the [API documentation](https://softprops.github.io/dynomite) for how
93 | to get started. Enjoy.
94 |
95 | ## 📦 Install
96 |
97 | In your Cargo.toml file, add the following under the `[dependencies]` heading
98 |
99 | ```toml
100 | dynomite = "0.10"
101 | ```
102 |
103 | ## 🤸 Examples
104 |
105 | You can find some example application code under [dynomite/examples](dynomite/examples)
106 |
107 | ### DynamoDB local
108 |
109 | AWS provides [a convenient way to host a local instance of DynamoDB](https://hub.docker.com/r/amazon/dynamodb-local/) for
110 | testing.
111 |
112 | Here is a short example of how to get up and running quickly with local testing, using both dynomite and `rusoto_dynamodb`.
113 |
114 | In one terminal spin up a Docker container for [DynamoDB local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.UsageNotes.html) listening on port `8000`
115 |
116 | ```sh
117 | $ docker run --rm -p 8000:8000 amazon/dynamodb-local
118 | ```
119 |
120 | In another, run a rust binary with a client initialized like you see in the [local.rs example](dynomite/examples/local.rs)
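
The gist of that setup is a client constructed with a custom region pointing at the local endpoint, roughly:

```rust
use dynomite::dynamodb::DynamoDbClient;
use rusoto_core::Region;

// target the DynamoDB local container started above
let client = DynamoDbClient::new(Region::Custom {
    name: "us-east-1".into(),
    endpoint: "http://localhost:8000".into(),
});
```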
121 |
122 | ## Resources
123 |
124 | * [How DynamoDB works](https://www.slideshare.net/AmazonWebServices/amazon-dynamodb-under-the-hood-how-we-built-a-hyperscale-database-dat321-aws-reinvent-2018)
125 |
126 | Doug Tangren (softprops) 2018-2020
127 |
--------------------------------------------------------------------------------
/dynomite-derive/Cargo.toml:
--------------------------------------------------------------------------------
1 | [package]
2 | name = "dynomite-derive"
3 | version = "0.10.0"
4 | authors = ["softprops "]
5 | description = "Derives AWS DynamoDB dynomite types from native Rust struct types"
6 | license = "MIT"
7 | keywords = ["dynamodb", "rusoto", "rusoto_dynamodb"]
8 | readme = "../README.md"
9 | documentation = "https://docs.rs/dynomite-derive"
10 | homepage = "https://github.com/softprops/dynomite"
11 | repository = "https://github.com/softprops/dynomite"
12 | edition = "2018"
13 |
14 | [badges]
15 | coveralls = { repository = "softprops/dynomite" }
16 | maintenance = { status = "actively-developed" }
17 |
18 | [lib]
19 | proc-macro = true
20 |
21 | [dependencies]
22 | quote = "^1.0"
23 | syn = "^1.0"
24 | proc-macro2 = "^1.0"
25 | proc-macro-error = "1.0"
26 |
--------------------------------------------------------------------------------
/dynomite-derive/src/attr.rs:
--------------------------------------------------------------------------------
1 | //! dynomite field attributes
2 |
3 | use proc_macro_error::abort;
4 | use syn::{
5 | parse::{Parse, ParseStream},
6 | Ident, LitStr, Path, Token,
7 | };
8 |
9 | /// Represents a parsed attribute that appears in `#[dynomite(...)]`.
10 | #[derive(Clone)]
11 | pub(crate) struct Attr<Kind> {
12 | /// The identifier part of the attribute (e.g. `rename` in `#[dynomite(rename = "foo")]`)
13 | pub(crate) ident: Ident,
14 | /// More specific information about the metadata entry.
15 | pub(crate) kind: Kind,
16 | }
17 |
18 | /// Attribute that appears on record fields (struct fields and enum record variant fields)
19 | pub(crate) type FieldAttr = Attr<FieldAttrKind>;
20 | /// Attribute that appears on the top level of an enum
21 | pub(crate) type EnumAttr = Attr<EnumAttrKind>;
22 | /// Attribute that appears on enum variants
23 | pub(crate) type VariantAttr = Attr<VariantAttrKind>;
24 |
25 | #[derive(Clone)]
26 | pub(crate) enum FieldAttrKind {
27 | /// Denotes field should be replaced with Default impl when absent in ddb
28 | Default,
29 |
30 | /// Denotes field should be renamed to value of LitStr
31 | Rename(LitStr),
32 |
33 | /// Denotes Item partition (primary) key
34 | PartitionKey,
35 |
36 | /// Denotes Item sort key
37 | SortKey,
38 |
39 | /// Denotes a field that should be replaced with all of its subfields
40 | Flatten,
41 |
42 | /// Denotes a field that should not be present in the resulting `Attributes` map
43 | /// if the given function returns `true` for its value
44 | SkipSerializingIf(Path),
45 | }
46 |
47 | impl DynomiteAttr for FieldAttrKind {
48 | const KVS: Kvs<Self> = &[
49 | ("rename", |lit| Ok(FieldAttrKind::Rename(lit))),
50 | ("skip_serializing_if", |lit| {
51 | lit.parse().map(FieldAttrKind::SkipSerializingIf)
52 | }),
53 | ];
54 | const KEYS: Keys<Self> = &[
55 | ("default", FieldAttrKind::Default),
56 | ("partition_key", FieldAttrKind::PartitionKey),
57 | ("sort_key", FieldAttrKind::SortKey),
58 | ("flatten", FieldAttrKind::Flatten),
59 | ];
60 | }
61 |
62 | #[derive(Clone)]
63 | pub(crate) enum EnumAttrKind {
64 | // FIXME: implement content attribute to support non-map values in enum variants
65 | // (adjacently tagged enums: https://serde.rs/enum-representations.html#adjacently-tagged)
66 | // Content(LitStr),
67 | /// The name of the tag field for an internally-tagged enum
68 | Tag(LitStr),
69 | }
70 |
71 | impl DynomiteAttr for EnumAttrKind {
72 | const KVS: Kvs = &[("tag", |lit| Ok(EnumAttrKind::Tag(lit)))];
73 | }
74 |
75 | #[derive(Clone)]
76 | pub(crate) enum VariantAttrKind {
77 | // TODO: add default for enum variants?
78 | Rename(LitStr),
79 | }
80 |
81 | impl DynomiteAttr for VariantAttrKind {
82 | const KVS: Kvs = &[("rename", |lit| Ok(VariantAttrKind::Rename(lit)))];
83 | }
84 |
85 | type Kvs<T> = &'static [(&'static str, fn(syn::LitStr) -> syn::Result<T>)];
86 | type Keys<T> = &'static [(&'static str, T)];
87 |
88 | /// Helper to ease defining `#[dynomite(key)]` and `#[dynomite(key = "val")]` attributes
89 | pub(crate) trait DynomiteAttr: Clone + Sized + 'static {
90 | /// List of `("attr_name", enum_variant_constructor)` to define attributes
91 | /// that require a value string literal (e.g. `rename = "foo"`)
92 | const KVS: Kvs<Self> = &[];
93 | /// List of `("attr_name", enum_variant_value)` entries to define attributes
94 | /// that should not have any value (e.g. `default` or `flatten`)
95 | const KEYS: Keys<Self> = &[];
96 | }
97 |
98 | impl<A: DynomiteAttr> Parse for Attr<A> {
99 | fn parse(input: ParseStream) -> syn::Result<Self> {
100 | let entry: MetadataEntry = input.parse()?;
101 | let kind = entry
102 | .try_attr_with_val(A::KVS)?
103 | .or_else(|| entry.try_attr_without_val(A::KEYS))
104 | .unwrap_or_else(|| abort!(entry.key, "unexpected dynomite attribute: {}", entry.key));
105 | Ok(Attr {
106 | ident: entry.key,
107 | kind,
108 | })
109 | }
110 | }
111 |
112 | struct MetadataEntry {
113 | key: Ident,
114 | val: Option<LitStr>,
115 | }
116 |
117 | impl MetadataEntry {
118 | /// Attempt to map the parsed entry to an identifier-only attribute from the list
119 | fn try_attr_without_val<T: Clone>(
120 | &self,
121 | mappings: Keys<T>,
122 | ) -> Option<T> {
123 | let Self { key, val } = self;
124 | let key_str = key.to_string();
125 | mappings
126 | .iter()
127 | .find(|(key_pat, _)| *key_pat == key_str)
128 | .map(|(_, enum_val)| match val {
129 | Some(_) => abort!(key, "expected no value for dynomite attribute `{}`", key),
130 | None => enum_val.clone(),
131 | })
132 | }
133 |
134 | /// Attempt to map the parsed entry to a key-value attribute from the list
135 | fn try_attr_with_val<T>(
136 | &self,
137 | mappings: Kvs<T>,
138 | ) -> syn::Result<Option<T>> {
139 | let Self { key, val } = self;
140 | let key_str = key.to_string();
141 | mappings
142 | .iter()
143 | .find(|(key_pat, _)| *key_pat == key_str)
144 | .map(|(_, to_enum)| match val {
145 | Some(it) => to_enum(it.clone()),
146 | None => abort!(
147 | key,
148 | "expected a value for dynomite attribute: `{} = \"foo\"`",
149 | key
150 | ),
151 | })
152 | .transpose()
153 | }
154 | }
155 |
156 | impl Parse for MetadataEntry {
157 | fn parse(input: ParseStream) -> syn::Result<Self> {
158 | let key: Ident = input.parse()?;
159 | if input.peek(syn::token::Paren) {
160 | // `name(...)` attributes.
161 | abort!(key, "unexpected paren in dynomite attribute: {}", key);
162 | }
163 | Ok(Self {
164 | key,
165 | val: input
166 | .parse::<Token![=]>()
167 | .ok()
168 | .map(|_| input.parse())
169 | .transpose()?,
170 | })
171 | }
172 | }
173 |
--------------------------------------------------------------------------------
/dynomite-derive/src/lib.rs:
--------------------------------------------------------------------------------
1 | //! Provides procedural macros for deriving dynomite types for your structs and enum types
2 | //!
3 | //! # Examples
4 | //!
5 | //! ```ignore
6 | //! use dynomite::{Item, FromAttributes, Attributes};
7 | //! use dynomite::dynamodb::AttributeValue;
8 | //!
9 | //! // derive Item
10 | //! #[derive(Item, PartialEq, Debug, Clone)]
11 | //! struct Person {
12 | //! #[dynomite(partition_key)] id: String
13 | //! }
14 | //!
15 | //! let person = Person { id: "123".into() };
16 | //! // convert person to string keys and attribute values
17 | //! let attributes: Attributes = person.clone().into();
18 | //! // convert attributes into person type
19 | //! assert_eq!(person, Person::from_attrs(attributes).unwrap());
20 | //!
21 | //! // dynamodb types require only primary key attributes and may contain
22 | //! // other fields; when looking up items only those key attributes are required
23 | //! // dynomite derives a new {Name}Key struct for you which contains
24 | //! // only those and also implements Item
25 | //! let key = PersonKey { id: "123".into() };
26 | //! let key_attributes: Attributes = key.clone().into();
27 | //! // convert attributes into person type
28 | //! assert_eq!(key, PersonKey::from_attrs(key_attributes).unwrap());
29 | //! ```
30 |
31 | mod attr;
32 | use std::collections::HashSet;
33 |
34 | use attr::{EnumAttr, EnumAttrKind, FieldAttr, FieldAttrKind, VariantAttr};
35 |
36 | use proc_macro::TokenStream;
37 | use proc_macro2::Span;
38 | use proc_macro_error::{abort, ResultExt};
39 | use quote::{quote, ToTokens};
40 | use syn::{
41 | parse::Parse, punctuated::Punctuated, Attribute, DataStruct, DeriveInput, Field, Fields, Ident,
42 | Path, Token, Visibility,
43 | };
44 |
45 | struct Variant {
46 | inner: syn::Variant,
47 | attrs: Vec<VariantAttr>,
48 | }
49 |
50 | impl Variant {
51 | fn deser_name(&self) -> String {
52 | self.attrs
53 | .iter()
54 | .find_map(|it| match &it.kind {
55 | attr::VariantAttrKind::Rename(it) => Some(it.value()),
56 | })
57 | .unwrap_or_else(|| self.inner.ident.to_string())
58 | }
59 | }
60 |
61 | struct DataEnum {
62 | attrs: Vec<EnumAttr>,
63 | ident: syn::Ident,
64 | variants: Vec<Variant>,
65 | }
66 |
67 | impl DataEnum {
68 | fn new(
69 | ident: Ident,
70 | inner: syn::DataEnum,
71 | attrs: &[Attribute],
72 | ) -> Self {
73 | let me = Self {
74 | attrs: parse_attrs(attrs),
75 | ident,
76 | variants: inner
77 | .variants
78 | .into_iter()
79 | .map(|inner| {
80 | let attrs = parse_attrs(&inner.attrs);
81 | Variant { inner, attrs }
82 | })
83 | .collect(),
84 | };
85 |
86 | // Validate that all enum tag values are unique
87 | let mut unique_names = HashSet::new();
88 | for variant in &me.variants {
89 | if let Some(existing) = unique_names.replace(variant.deser_name()) {
90 | abort!(
91 | variant.inner.ident.span(),
92 | "Duplicate tag name detected: `{}`", existing;
93 | help = "Please ensure that no `rename = \"tag_value\"` \
94 | clauses conflict with each other and remaining enum variants' names"
95 | );
96 | }
97 | }
98 | me
99 | }
100 |
101 | fn tag_key(&self) -> String {
102 | self.attrs
103 | .iter()
104 | .find_map(|attr| match &attr.kind {
105 | EnumAttrKind::Tag(lit) => Some(lit.value()),
106 | })
107 | .unwrap_or_else(|| {
108 | abort!(
109 | self.ident,
110 | "#[derive(Attributes)] for fat enums must have a sibling \
111 | #[dynomite(tag = \"key\")] attribute to specify the descriptor field name.";
112 | note = "Only internally tagged enums are supported in this version of dynomite."
113 | )
114 | })
115 | }
116 |
117 | fn impl_from_attributes(&self) -> impl ToTokens {
118 | let match_arms = self.variants.iter().map(|variant| {
119 | let variant_ident = &variant.inner.ident;
120 | let expr = match &variant.inner.fields {
121 | Fields::Named(_record) => Self::unimplemented_record_variants(variant),
122 | Fields::Unnamed(tuple) => {
123 | Self::expect_single_item_tuple(tuple, variant_ident);
124 | quote! { Self::#variant_ident(::dynomite::FromAttributes::from_attrs(attrs)?) }
125 | }
126 | Fields::Unit => quote! { Self::#variant_ident },
127 | };
128 | let variant_deser_name = variant.deser_name();
129 | quote! { #variant_deser_name => #expr, }
130 | });
131 |
132 | let enum_ident = &self.ident;
133 | let tag_key = self.tag_key();
134 | quote! {
135 | impl ::dynomite::FromAttributes for #enum_ident {
136 | fn from_attrs(attrs: &mut ::dynomite::Attributes) -> ::std::result::Result<Self, ::dynomite::AttributeError> {
137 | use ::std::{string::String, result::Result::{Ok, Err}};
138 | use ::dynomite::{Attribute, AttributeError};
139 |
140 | let tag = attrs.remove(#tag_key).ok_or_else(|| {
141 | AttributeError::MissingField {
142 | name: #tag_key.to_owned(),
143 | }
144 | })?;
145 | let tag: String = Attribute::from_attr(tag)?;
146 | Ok(match tag.as_str() {
147 | #(#match_arms)*
148 | _ => return Err(AttributeError::InvalidFormat)
149 | })
150 | }
151 | }
152 | }
153 | }
154 |
155 | fn impl_into_attributes(&self) -> impl ToTokens {
156 | let enum_ident = &self.ident;
157 |
158 | let match_arms = self.variants.iter().map(|variant| {
159 | let variant_ident = &variant.inner.ident;
160 | let variant_deser_name = variant.deser_name();
161 | match &variant.inner.fields {
162 | Fields::Named(_record) => Self::unimplemented_record_variants(variant),
163 | Fields::Unnamed(tuple) => {
164 | Self::expect_single_item_tuple(tuple, variant_ident);
165 |
166 | quote! {
167 | Self::#variant_ident(variant) => {
168 | ::dynomite::IntoAttributes::into_attrs(variant, attrs);
169 | #variant_deser_name
170 | }
171 | }
172 | }
173 | Fields::Unit => quote! { Self::#variant_ident => #variant_deser_name, },
174 | }
175 | });
176 |
177 | let tag_key = self.tag_key();
178 |
179 | quote! {
180 | impl ::dynomite::IntoAttributes for #enum_ident {
181 | fn into_attrs(self, attrs: &mut ::dynomite::Attributes) {
182 | let tag = match self {
183 | #(#match_arms)*
184 | };
185 | let tag = ::dynomite::Attribute::into_attr(tag.to_owned());
186 | attrs.insert(#tag_key.to_owned(), tag);
187 | }
188 | }
189 | }
190 | }
191 |
192 | fn unimplemented_record_variants(variant: &Variant) -> ! {
193 | abort!(
194 | variant.inner.ident.span(),
195 | "Record enum variants are not implemented yet."
196 | )
197 | }
198 |
199 | fn expect_single_item_tuple(
200 | tuple: &syn::FieldsUnnamed,
201 | variant_ident: &Ident,
202 | ) {
203 | if tuple.unnamed.len() != 1 {
204 | abort!(
205 | variant_ident,
206 | "Tuple variants with {} elements are not supported yet in dynomite, use \
207 | single-element tuples for now. \
208 | This restriction may be relaxed in future (follow the updates).",
209 | tuple.unnamed.len(),
210 | )
211 | }
212 | }
213 | }
214 |
215 | /// A Field and all its extracted dynomite derive attrs
216 | #[derive(Clone)]
217 | struct ItemField<'a> {
218 | field: &'a Field,
219 | attrs: Vec<FieldAttr>,
220 | }
221 |
222 | impl<'a> ItemField<'a> {
223 | fn new(field: &'a Field) -> Self {
224 | let attrs = parse_attrs(&field.attrs);
225 | let me = Self { field, attrs };
226 | if me.is_flatten() {
227 | if let Some(it) = me
228 | .attrs
229 | .iter()
230 | .find(|it| !matches!(it.kind, FieldAttrKind::Flatten))
231 | {
232 | abort!(
233 | it.ident,
234 | "If #[dynomite(flatten)] is used, no other dynomite attributes are allowed on the field"
235 | );
236 | }
237 | }
238 | me
239 | }
240 |
241 | fn is_partition_key(&self) -> bool {
242 | self.attrs
243 | .iter()
244 | .any(|attr| matches!(attr.kind, FieldAttrKind::PartitionKey))
245 | }
246 |
247 | fn is_sort_key(&self) -> bool {
248 | self.attrs
249 | .iter()
250 | .any(|attr| matches!(attr.kind, FieldAttrKind::SortKey))
251 | }
252 |
253 | fn is_default_when_absent(&self) -> bool {
254 | self.attrs
255 | .iter()
256 | .any(|attr| matches!(attr.kind, FieldAttrKind::Default))
257 | }
258 |
259 | fn skip_serializing_if(&self) -> Option<&Path> {
260 | self.attrs.iter().find_map(|attr| match &attr.kind {
261 | FieldAttrKind::SkipSerializingIf(expr) => Some(expr),
262 | _ => None,
263 | })
264 | }
265 |
266 | fn is_flatten(&self) -> bool {
267 | self.attrs
268 | .iter()
269 | .any(|attr| matches!(attr.kind, FieldAttrKind::Flatten))
270 | }
271 |
272 | fn deser_name(&self) -> String {
273 | let ItemField { field, attrs } = self;
274 | attrs
275 | .iter()
276 | .find_map(|attr| match &attr.kind {
277 | FieldAttrKind::Rename(lit) => Some(lit.value()),
278 | _ => None,
279 | })
280 | .unwrap_or_else(|| {
281 | field
282 | .ident
283 | .as_ref()
284 | .expect("should have an identifier")
285 | .to_string()
286 | })
287 | }
288 | }
289 |
290 | fn parse_attrs<A: Parse>(all_attrs: &[Attribute]) -> Vec<A> {
291 | all_attrs
292 | .iter()
293 | .filter(|attr| is_dynomite_attr(attr))
294 | .flat_map(|attr| {
295 | attr.parse_args_with(Punctuated::<A, Token![,]>::parse_terminated)
296 | .unwrap_or_abort()
297 | })
298 | .collect()
299 | }
300 |
301 | /// Derives `dynomite::Item` type for struts with named fields
302 | ///
303 | /// # Attributes
304 | ///
305 | /// * `#[dynomite(partition_key)]` - required attribute, expected to be applied to the target [partition attribute](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.PrimaryKey) field with a derivable DynamoDB attribute value of String, Number or Binary
306 | /// * `#[dynomite(sort_key)]` - optional attribute, may be applied to one target [sort attribute](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.SecondaryIndexes) field with a derivable DynamoDB attribute value of String, Number or Binary
307 | /// * `#[dynomite(rename = "actualName")]` - optional attribute, may be applied to any item attribute field, useful when the DynamoDB table you're interfacing with has attributes whose names don't follow Rust's naming conventions
308 | ///
309 | /// # Panics
310 | ///
311 | /// This proc macro will panic when applied to other types
312 | #[proc_macro_error::proc_macro_error]
313 | #[proc_macro_derive(Item, attributes(partition_key, sort_key, dynomite))]
314 | pub fn derive_item(input: TokenStream) -> TokenStream {
315 | let ast = syn::parse_macro_input!(input);
316 |
317 | let gen = match expand_item(ast) {
318 | Ok(g) => g,
319 | Err(e) => return e.to_compile_error().into(),
320 | };
321 |
322 | gen.into_token_stream().into()
323 | }
324 |
325 | /// similar in spirit to `#[derive(Item)]` except these are exempt from declaring
326 | /// partition and sort keys
327 | #[proc_macro_error::proc_macro_error]
328 | #[proc_macro_derive(Attributes, attributes(dynomite))]
329 | pub fn derive_attributes(input: TokenStream) -> TokenStream {
330 | let ast = syn::parse_macro_input!(input);
331 | expand_attributes(ast).unwrap_or_else(|e| e.to_compile_error().into())
332 | }
333 |
334 | /// Derives `dynomite::Attribute` for enum types
335 | ///
336 | /// # Panics
337 | ///
338 | /// This proc macro will panic when applied to other types
339 | #[proc_macro_error::proc_macro_error]
340 | #[proc_macro_derive(Attribute)]
341 | pub fn derive_attribute(input: TokenStream) -> TokenStream {
342 | let ast = syn::parse_macro_input!(input);
343 | let gen = expand_attribute(ast);
344 | gen.into_token_stream().into()
345 | }
346 |
347 | fn expand_attribute(ast: DeriveInput) -> impl ToTokens {
348 | let name = &ast.ident;
349 | match ast.data {
350 | syn::Data::Enum(variants) => {
351 | make_dynomite_attr(name, &variants.variants.into_iter().collect::<Vec<_>>())
352 | }
353 | _ => panic!("Dynomite Attributes can only be generated for enum types"),
354 | }
355 | }
356 |
357 | /// ```rust,ignore
358 | /// impl ::dynomite::Attribute for Name {
359 | /// fn into_attr(self) -> ::dynomite::dynamodb::AttributeValue {
360 | /// let arm = match self {
361 | /// Name::Variant => "Variant".to_string()
362 | /// };
363 | /// ::dynomite::dynamodb::AttributeValue {
364 | /// s: Some(arm),
365 | /// ..Default::default()
366 | /// }
367 | /// }
368 | /// fn from_attr(value: ::dynomite::dynamodb::AttributeValue) -> Result<Self, ::dynomite::AttributeError> {
369 | /// value.s.ok_or(::dynomite::AttributeError::InvalidType)
370 | /// .and_then(|value| match &value[..] {
371 | /// "Variant" => Ok(Name::Variant),
372 | /// _ => Err(::dynomite::AttributeError::InvalidFormat)
373 | /// })
374 | /// }
375 | /// }
376 | /// ```
377 | fn make_dynomite_attr(
378 | name: &Ident,
379 | variants: &[syn::Variant],
380 | ) -> impl ToTokens {
381 | let attr = quote!(::dynomite::Attribute);
382 | let err = quote!(::dynomite::AttributeError);
383 | let into_match_arms = variants.iter().map(|var| {
384 | let vname = &var.ident;
385 | quote! {
386 | #name::#vname => stringify!(#vname).to_string(),
387 | }
388 | });
389 | let from_match_arms = variants.iter().map(|var| {
390 | let vname = &var.ident;
391 | quote! {
392 | stringify!(#vname) => ::std::result::Result::Ok(#name::#vname),
393 | }
394 | });
395 |
396 | quote! {
397 | impl #attr for #name {
398 | fn into_attr(self) -> ::dynomite::dynamodb::AttributeValue {
399 | let arm = match self {
400 | #(#into_match_arms)*
401 | };
402 | ::dynomite::dynamodb::AttributeValue {
403 | s: ::std::option::Option::Some(arm),
404 | ..::std::default::Default::default()
405 | }
406 | }
407 | fn from_attr(value: ::dynomite::dynamodb::AttributeValue) -> ::std::result::Result<Self, #err> {
408 | value.s.ok_or(::dynomite::AttributeError::InvalidType)
409 | .and_then(|value| match &value[..] {
410 | #(#from_match_arms)*
411 | _ => ::std::result::Result::Err(::dynomite::AttributeError::InvalidFormat)
412 | })
413 | }
414 | }
415 | }
416 | }
417 |
418 | fn expand_attributes(ast: DeriveInput) -> syn::Result<TokenStream> {
419 | use syn::spanned::Spanned as _;
420 | let name = ast.ident;
421 | let tokens = match ast.data {
422 | syn::Data::Struct(DataStruct { fields, .. }) => match fields {
423 | Fields::Named(named) => {
424 | make_dynomite_attrs_for_struct(&name, &named.named.into_iter().collect::<Vec<_>>())
425 | .into_token_stream()
426 | }
427 | fields => {
428 | return Err(syn::Error::new(
429 | fields.span(),
430 | "Dynomite Attributes require named fields",
431 | ))
432 | }
433 | },
434 | syn::Data::Enum(data_enum) => {
435 | make_dynomite_attrs_for_enum(&DataEnum::new(name, data_enum, &ast.attrs))
436 | .into_token_stream()
437 | }
438 | _ => panic!("Dynomite Attributes can only be generated for structs"),
439 | };
440 | Ok(tokens.into())
441 | }
442 |
443 | fn expand_item(ast: DeriveInput) -> syn::Result<impl ToTokens> {
444 | use syn::spanned::Spanned as _;
445 | let name = &ast.ident;
446 | let vis = &ast.vis;
447 | match ast.data {
448 | syn::Data::Struct(DataStruct { fields, .. }) => match fields {
449 | Fields::Named(named) => {
450 | make_dynomite_item(vis, name, &named.named.into_iter().collect::>())
451 | }
452 | fields => Err(syn::Error::new(
453 | fields.span(),
454 | "Dynomite Items require named fields",
455 | )),
456 | },
457 | _ => panic!("Dynomite Items can only be generated for structs"),
458 | }
459 | }
460 |
461 | fn make_dynomite_attrs_for_enum(enum_item: &DataEnum) -> impl ToTokens {
462 | let from_attributes = enum_item.impl_from_attributes();
463 | let into_attributes = enum_item.impl_into_attributes();
464 | let std_into_attrs = get_std_convert_traits(&enum_item.ident);
465 |
466 | quote! {
467 | #from_attributes
468 | #into_attributes
469 | #std_into_attrs
470 | }
471 | }
472 |
473 | fn make_dynomite_attrs_for_struct(
474 | name: &Ident,
475 | fields: &[Field],
476 | ) -> impl ToTokens {
477 | let item_fields = fields.iter().map(ItemField::new).collect::<Vec<_>>();
478 | // impl ::dynomite::FromAttributes for Name
479 | let from_attribute_map = get_from_attributes_trait(name, &item_fields);
480 | // impl ::dynomite::IntoAttributes for Name
481 | // impl From for ::dynomite::Attributes
482 | let to_attribute_map = get_into_attribute_map_trait(name, &item_fields);
483 | // impl TryFrom<::dynomite::Attributes> for Name
484 | // impl From for ::dynomite::Attributes
485 | let std_into_attrs = get_std_convert_traits(name);
486 |
487 | quote! {
488 | #from_attribute_map
489 | #to_attribute_map
490 | #std_into_attrs
491 | }
492 | }
493 |
494 | fn make_dynomite_item(
495 | vis: &Visibility,
496 | name: &Ident,
497 | fields: &[Field],
498 | ) -> syn::Result<impl ToTokens> {
499 | let item_fields = fields.iter().map(ItemField::new).collect::<Vec<_>>();
500 | // all items must have 1 primary_key
501 | let partition_key_count = item_fields.iter().filter(|f| f.is_partition_key()).count();
502 | if partition_key_count != 1 {
503 | return Err(syn::Error::new(
504 | name.span(),
505 | format!(
506 | "All Item's must declare one and only one partition_key. The `{}` Item declared {}",
507 | name, partition_key_count
508 | ),
509 | ));
510 | }
511 | // impl Item for Name + NameKey struct
512 | let dynamodb_traits = get_dynomite_item_traits(vis, name, &item_fields)?;
513 | // impl ::dynomite::FromAttributes for Name
514 | let from_attribute_map = get_from_attributes_trait(name, &item_fields);
515 | // impl ::dynomite::IntoAttributes for Name
516 | let to_attribute_map = get_into_attribute_map_trait(name, &item_fields);
517 | // impl TryFrom<::dynomite::Attributes> for Name
518 | // impl From for ::dynomite::Attributes
519 | let std_into_attrs = get_std_convert_traits(name);
520 |
521 | Ok(quote! {
522 | #from_attribute_map
523 | #to_attribute_map
524 | #std_into_attrs
525 | #dynamodb_traits
526 | })
527 | }
528 |
529 | fn get_into_attribute_map_trait(
530 | name: &Ident,
531 | fields: &[ItemField],
532 | ) -> impl ToTokens {
533 | let into_attrs = get_into_attrs(fields);
534 |
535 | quote! {
536 | impl ::dynomite::IntoAttributes for #name {
537 | #into_attrs
538 | }
539 | }
540 | }
541 |
542 | fn get_std_convert_traits(entity_name: &Ident) -> impl ToTokens {
543 | quote! {
544 | impl ::std::convert::TryFrom<::dynomite::Attributes> for #entity_name {
545 | type Error = ::dynomite::AttributeError;
546 |
547 | fn try_from(mut attrs: ::dynomite::Attributes) -> ::std::result::Result<Self, Self::Error> {
548 | ::dynomite::FromAttributes::from_attrs(&mut attrs)
549 | }
550 | }
551 |
552 | impl ::std::convert::From<#entity_name> for ::dynomite::Attributes {
553 | fn from(entity: #entity_name) -> Self {
554 | let mut map = ::dynomite::Attributes::new();
555 | ::dynomite::IntoAttributes::into_attrs(entity, &mut map);
556 | map
557 | }
558 | }
559 | }
560 | }
561 |
562 | fn get_into_attrs(fields: &[ItemField]) -> impl ToTokens {
563 | let field_conversions = fields.iter().map(|field| {
564 | let field_deser_name = field.deser_name();
565 | let field_ident = &field.field.ident;
566 |
567 | let insert_attr = quote! {
568 | attrs.insert(
569 | #field_deser_name.to_string(),
570 | ::dynomite::Attribute::into_attr(self.#field_ident)
571 | );
572 | };
573 |
574 | if let Some(skip_serializing_if) = field.skip_serializing_if() {
575 | quote! {
576 | if !#skip_serializing_if(&self.#field_ident) {
577 | #insert_attr
578 | }
579 | }
580 | } else if field.is_flatten() {
581 | quote! {
582 | ::dynomite::IntoAttributes::into_attrs(self.#field_ident, attrs);
583 | }
584 | } else {
585 | insert_attr
586 | }
587 | });
588 |
589 | quote! {
590 | fn into_attrs(self, attrs: &mut ::dynomite::Attributes) {
591 | #(#field_conversions)*
592 | }
593 | }
594 | }
595 |
596 | /// ```rust,ignore
597 | /// impl ::dynomite::FromAttributes for Name {
598 | /// fn from_attrs(attrs: &mut ::dynomite::Attributes) -> Result<Self, ::dynomite::AttributeError> {
599 | /// let field_name = ::dynomite::Attribute::from_attr(
600 | /// attrs.remove("field_deser_name").ok_or_else(|| Error::MissingField { name: "field_deser_name".to_string() })?
601 | /// );
602 | /// Ok(Self {
603 | /// field_name,
604 | /// })
605 | /// }
606 | /// }
607 | /// ```
608 | fn get_from_attributes_trait(
609 | name: &Ident,
610 | fields: &[ItemField],
611 | ) -> impl ToTokens {
612 | let from_attrs = quote!(::dynomite::FromAttributes);
613 | let from_attrs_fn = get_from_attrs_function(fields);
614 |
615 | quote! {
616 | impl #from_attrs for #name {
617 | #from_attrs_fn
618 | }
619 | }
620 | }
621 |
622 | fn get_from_attrs_function(fields: &[ItemField]) -> impl ToTokens {
623 | let var_init_statements = fields
624 | .iter()
625 | .map(|field| {
626 | // field might have #[dynomite(rename = "...")] attribute
627 | let field_deser_name = field.deser_name();
628 | let field_ident = &field.field.ident;
629 | let expr = if field.is_default_when_absent() {
630 | quote! {
631 | match attrs.remove(#field_deser_name) {
632 | Some(field) => ::dynomite::Attribute::from_attr(field)?,
633 | _ => ::std::default::Default::default()
634 | }
635 | }
636 | } else if field.is_flatten() {
637 | quote! { ::dynomite::FromAttributes::from_attrs(attrs)? }
638 | } else {
639 | quote! {
640 | ::dynomite::Attribute::from_attr(
641 | attrs.remove(#field_deser_name).ok_or_else(|| ::dynomite::AttributeError::MissingField {
642 | name: #field_deser_name.to_string()
643 | })?
644 | )?
645 | }
646 | };
647 | quote! {
648 | let #field_ident = #expr;
649 | }
650 | });
651 |
652 | let field_names = fields.iter().map(|it| &it.field.ident);
653 |
654 | // The order of evaluation of struct literal fields seems
655 | // **informally** left-to-right (as per Niko Matsakis and Steve Klabnik),
656 | // https://stackoverflow.com/a/57612600/9259330
657 | // This means we should not rely on this behavior yet.
658 | // We explicitly make conversion expressions a separate statements.
659 | // This is important, because the order of declaration and evaluation
660 | // of `flatten` fields matters.
661 |
662 | quote! {
663 | fn from_attrs(attrs: &mut ::dynomite::Attributes) -> ::std::result::Result<Self, ::dynomite::AttributeError> {
664 | #(#var_init_statements)*
665 | ::std::result::Result::Ok(Self {
666 | #(#field_names),*
667 | })
668 | }
669 | }
670 | }
671 |
672 | fn get_dynomite_item_traits(
673 | vis: &Visibility,
674 | name: &Ident,
675 | fields: &[ItemField],
676 | ) -> syn::Result<impl ToTokens> {
677 | let impls = get_item_impls(vis, name, fields)?;
678 |
679 | Ok(quote! {
680 | #impls
681 | })
682 | }
683 |
684 | fn get_item_impls(
685 | vis: &Visibility,
686 | name: &Ident,
687 | fields: &[ItemField],
688 | ) -> syn::Result<impl ToTokens> {
689 | // impl ::dynomite::Item for Name ...
690 | let item_trait = get_item_trait(name, fields)?;
691 | // pub struct NameKey ...
692 | let key_struct = get_key_struct(vis, name, fields)?;
693 |
694 | Ok(quote! {
695 | #item_trait
696 | #key_struct
697 | })
698 | }
699 |
700 | /// ```rust,ignore
701 | /// impl ::dynomite::Item for Name {
702 | /// fn key(&self) -> ::std::collections::HashMap<String, ::dynomite::dynamodb::AttributeValue> {
703 | /// let mut keys = ::std::collections::HashMap::new();
704 | /// keys.insert("field_deser_name", to_attribute_value(field));
705 | /// keys
706 | /// }
707 | /// }
708 | /// ```
709 | fn get_item_trait(
710 | name: &Ident,
711 | fields: &[ItemField],
712 | ) -> syn::Result<impl ToTokens> {
713 | let item = quote!(::dynomite::Item);
714 | let attribute_map = quote!(
715 | ::std::collections::HashMap<String, ::dynomite::dynamodb::AttributeValue>
716 | );
717 | let partition_key_field = fields.iter().find(|f| f.is_partition_key());
718 | let sort_key_field = fields.iter().find(|f| f.is_sort_key());
719 | let partition_key_insert = partition_key_field.map(get_key_inserter).transpose()?;
720 | let sort_key_insert = sort_key_field.map(get_key_inserter).transpose()?;
721 |
722 | Ok(partition_key_field
723 | .map(|_| {
724 | quote! {
725 | impl #item for #name {
726 | fn key(&self) -> #attribute_map {
727 | let mut keys = ::std::collections::HashMap::new();
728 | #partition_key_insert
729 | #sort_key_insert
730 | keys
731 | }
732 | }
733 | }
734 | })
735 | .unwrap_or_else(proc_macro2::TokenStream::new))
736 | }
737 |
738 | /// ```rust,ignore
739 | /// keys.insert(
740 | /// "field_deser_name", to_attribute_value(field)
741 | /// );
742 | /// ```
743 | fn get_key_inserter(field: &ItemField) -> syn::Result<impl ToTokens> {
744 | let to_attribute_value = quote!(::dynomite::Attribute::into_attr);
745 |
746 | let field_deser_name = field.deser_name();
747 | let field_ident = &field.field.ident;
748 | Ok(quote! {
749 | keys.insert(
750 | #field_deser_name.to_string(),
751 | #to_attribute_value(self.#field_ident.clone())
752 | );
753 | })
754 | }
755 |
756 | /// ```rust,ignore
757 | /// #[derive(Item, Debug, Clone, PartialEq)]
758 | /// pub struct NameKey {
759 | /// partition_key_field,
760 | /// range_key
761 | /// }
762 | /// ```
763 | fn get_key_struct(
764 | vis: &Visibility,
765 | name: &Ident,
766 | fields: &[ItemField],
767 | ) -> syn::Result<impl ToTokens> {
768 | let name = Ident::new(&format!("{}Key", name), Span::call_site());
769 |
770 | let partition_key_field = fields
771 | .iter()
772 | .find(|field| field.is_partition_key())
773 | .cloned()
774 | .map(|field| {
775 | // clone because this is a new struct
776 | // note: this inherits field attrs so that
777 | // we retain dynomite(rename = "xxx")
778 | let mut field = field.field.clone();
779 | field.attrs.retain(is_dynomite_attr);
780 |
781 | quote! {
782 | #field
783 | }
784 | });
785 |
786 | let sort_key_field = fields
787 | .iter()
788 | .find(|field| field.is_sort_key())
789 | .cloned()
790 | .map(|field| {
791 | // clone because this is a new struct
792 | // note: this inherits field attrs so that
793 | // we retain dynomite(rename = "xxx")
794 | let mut field = field.field.clone();
795 | field.attrs.retain(is_dynomite_attr);
796 |
797 | quote! {
798 | #field
799 | }
800 | });
801 |
802 | Ok(partition_key_field
803 | .map(|partition_key_field| {
804 | quote! {
805 | #[derive(::dynomite::Attributes, Debug, Clone, PartialEq)]
806 | #vis struct #name {
807 | #partition_key_field,
808 | #sort_key_field
809 | }
810 | }
811 | })
812 | .unwrap_or_else(proc_macro2::TokenStream::new))
813 | }
814 |
815 | fn is_dynomite_attr(suspect: &syn::Attribute) -> bool {
816 | suspect.path.is_ident("dynomite")
817 | }
818 |
--------------------------------------------------------------------------------
/dynomite/Cargo.toml:
--------------------------------------------------------------------------------
1 | [package]
2 | name = "dynomite"
3 | version = "0.10.0"
4 | authors = ["softprops "]
5 | description = "Provides set of high-level productive DynamoDB interfaces"
6 | license = "MIT"
7 | keywords = ["dynamodb", "rusoto", "rusoto_dynamodb"]
8 | readme = "../README.md"
9 | documentation = "https://docs.rs/dynomite"
10 | homepage = "https://github.com/softprops/dynomite"
11 | repository = "https://github.com/softprops/dynomite"
12 | edition = "2018"
13 | categories = ["database"]
14 |
15 | [badges]
16 | coveralls = { repository = "softprops/dynomite" }
17 | maintenance = { status = "actively-developed" }
18 |
19 | [dependencies]
20 | async-trait = "0.1"
21 | again = "0.1"
22 | bytes = "1"
23 | dynomite-derive = { version = "0.10.0", path = "../dynomite-derive", optional = true }
24 | futures = "0.3"
25 | log = "0.4"
26 | # Disable default features since the `rustls` variant requires it. We re-enable `default` in our
27 | # `default` build configuration - see the [features] below.
28 | rusoto_core = { version = "0.47", optional = true, default_features = false }
29 | rusoto_dynamodb = { version = "0.47", optional = true, default_features = false }
30 | uuid = { version = "0.8", features = ["v4"], optional = true }
31 | chrono = { version = "0.4", optional = true }
32 |
33 | [dev-dependencies]
34 | env_logger = "0.8"
35 | maplit = "1.0"
36 | serde = "1.0"
37 | serde_json = "1.0"
38 | tokio = { version = "1", features = ["macros"] }
39 | lambda_http = { git = "https://github.com/awslabs/aws-lambda-rust-runtime/", branch = "master"}
40 | trybuild = "1.0"
41 | rustversion = "1.0"
42 | dynomite-derive = { version = "0.10.0", path = "../dynomite-derive" } # required by trybuild
43 | pretty_assertions = "0.7"
44 |
45 | [features]
46 | default = [
47 | "uuid",
48 | "chrono",
49 | "derive",
50 | "rusoto_core",
51 | "rusoto_dynamodb",
52 | # Enable the `default` features of these crates.
53 | "rusoto_core/default",
54 | "rusoto_dynamodb/default"
55 | ]
56 |
57 | rustls = [
58 | "uuid",
59 | "chrono",
60 | "derive",
61 | "rusoto_core",
62 | "rusoto_dynamodb",
63 | "rusoto_core/rustls",
64 | "rusoto_dynamodb/rustls"
65 | ]
66 | derive = ["dynomite-derive"]
67 |
--------------------------------------------------------------------------------
/dynomite/examples/demo.rs:
--------------------------------------------------------------------------------
1 | use dynomite::{
2 | attr_map,
3 | dynamodb::{
4 | AttributeDefinition, CreateTableInput, DynamoDb, DynamoDbClient, GetItemInput,
5 | KeySchemaElement, ProvisionedThroughput, PutItemInput, ScanInput,
6 | },
7 | retry::Policy,
8 | Attributes, DynamoDbExt, Item, Retries,
9 | };
10 | use futures::{future, TryStreamExt};
11 | use rusoto_core::Region;
12 | use std::{convert::TryFrom, error::Error};
13 | use uuid::Uuid;
14 |
15 | #[derive(Attributes, Debug, Clone)]
16 | pub struct Author {
17 | id: Uuid,
18 | #[dynomite(default)]
19 | name: String,
20 | }
21 |
22 | #[derive(Item, Debug, Clone)]
23 | pub struct Book {
24 | #[dynomite(partition_key)]
25 | id: Uuid,
26 | #[dynomite(rename = "bookTitle")]
27 | title: String,
28 | authors: Option<Vec<Author>>,
29 | }
30 |
31 | /// create a book table with a single string (S) primary key.
32 | /// if this table does not already exist
33 | /// this may take a second or two to provision.
34 | /// it will fail if this table already exists but that's okay,
35 | /// this is just an example :)
36 | async fn bootstrap<D>(
37 | client: &D,
38 | table_name: String,
39 | ) where
40 | D: DynamoDb,
41 | {
42 | let _ = client
43 | .create_table(CreateTableInput {
44 | table_name,
45 | key_schema: vec![KeySchemaElement {
46 | attribute_name: "id".into(),
47 | key_type: "HASH".into(),
48 | }],
49 | attribute_definitions: vec![AttributeDefinition {
50 | attribute_name: "id".into(),
51 | attribute_type: "S".into(),
52 | }],
53 | provisioned_throughput: Some(ProvisionedThroughput {
54 | read_capacity_units: 1,
55 | write_capacity_units: 1,
56 | }),
57 | ..CreateTableInput::default()
58 | })
59 | .await;
60 | }
61 |
62 | // this will create a rust book shelf in your aws account!
63 | #[tokio::main]
64 | async fn main() -> Result<(), Box<dyn Error>> {
65 | env_logger::init();
66 | // create rusoto client
67 | let client = DynamoDbClient::new(Region::default()).with_retries(Policy::default());
68 |
69 | let table_name = "books".to_string();
70 |
71 | bootstrap(&client, table_name.clone()).await;
72 |
73 | let authors = Some(vec![Author {
74 | id: Uuid::new_v4(),
75 | name: "Jo Bloggs".into(),
76 | }]);
77 |
78 | let book = Book {
79 | id: Uuid::new_v4(),
80 | title: "rust".into(),
81 | authors,
82 | };
83 |
84 | // print the key for this book
85 | // requires bringing `dynomite::Item` into scope
86 | println!("book.key() {:#?}", book.key());
87 |
88 | // add a book to the shelf
89 | println!(
90 | "put_item() result {:#?}",
91 | client
92 | .put_item(PutItemInput {
93 | table_name: table_name.clone(),
94 | item: book.clone().into(), // <= convert book into its attribute map representation
95 | ..PutItemInput::default()
96 | })
97 | .await?
98 | );
99 |
100 | println!(
101 | "put_item() result {:#?}",
102 | client
103 | .put_item(PutItemInput {
104 | table_name: table_name.clone(),
105 | // convert book into its attribute map representation
106 | item: Book {
107 | id: Uuid::new_v4(),
108 | title: "rust and beyond".into(),
109 | authors: Some(vec![Author {
110 | id: Uuid::new_v4(),
111 | name: "Jim Ferris".into(),
112 | }]),
113 | }
114 | .into(),
115 | ..PutItemInput::default()
116 | })
117 | .await?
118 | );
119 |
120 | // scan through all pages of results in the books table for books whose title is "rust"
121 | println!(
122 | "scan result {:#?}",
123 | client
124 | .clone()
125 | .scan_pages(ScanInput {
126 | limit: Some(1), // to demonstrate we're getting through more than one page
127 | table_name: table_name.clone(),
128 | filter_expression: Some("bookTitle = :title".into()),
129 | expression_attribute_values: Some(attr_map!(
130 | ":title" => "rust".to_string()
131 | )),
132 | ..ScanInput::default()
133 | })
134 | .try_for_each(|item| {
135 | println!("stream_scan() item {:#?}", Book::try_from(item));
136 | future::ready(Ok(()))
137 | })
138 | .await? // attempt to convert an attribute map to a book type
139 | );
140 |
141 | // get the "rust" book by the Book type's generated key
142 | println!(
143 | "get_item() result {:#?}",
144 | client
145 | .get_item(GetItemInput {
146 | table_name,
147 | key: book.key(), // get a book by key
148 | ..GetItemInput::default()
149 | })
150 | .await?
151 | .item
152 | .map(Book::try_from) // attempt to convert an attribute map to a book type
153 | );
154 | Ok(())
155 | }
156 |
--------------------------------------------------------------------------------
/dynomite/examples/lambda.rs:
--------------------------------------------------------------------------------
1 | use dynomite::{
2 | dynamodb::{DynamoDb, DynamoDbClient},
3 | retry::Policy,
4 | Retries,
5 | };
6 | use lambda_http::{handler, lambda_runtime};
7 |
8 | type Error = Box<dyn std::error::Error + Send + Sync + 'static>;
9 |
10 | #[tokio::main]
11 | async fn main() -> Result<(), Error> {
12 | let client = DynamoDbClient::new(Default::default()).with_retries(Policy::default());
13 |
14 | lambda_runtime::run(handler(move |_, _| {
15 | let client = client.clone();
16 | async move {
17 | let tables = client
18 | .list_tables(Default::default())
19 | .await?
20 | .table_names
21 | .unwrap_or_default();
22 | Ok::<_, Error>(tables.join("\n"))
23 | }
24 | }))
25 | .await?;
26 |
27 | Ok(())
28 | }
29 |
--------------------------------------------------------------------------------
/dynomite/examples/local.rs:
--------------------------------------------------------------------------------
1 | /// Assumes you are running the following `dynamodb-local`
2 | /// on your host machine
3 | ///
4 | /// ```bash
5 | /// $ docker run -p 8000:8000 amazon/dynamodb-local
6 | /// ```
7 | use dynomite::{
8 | attr_map,
9 | dynamodb::{
10 | AttributeDefinition, CreateTableInput, DynamoDb, DynamoDbClient, GetItemInput,
11 | KeySchemaElement, ProvisionedThroughput, PutItemInput, ScanInput,
12 | },
13 | retry::Policy,
14 | DynamoDbExt, Item, Retries,
15 | };
16 | use futures::{future, TryStreamExt};
17 | use rusoto_core::Region;
18 | use std::{convert::TryFrom, error::Error};
19 | use uuid::Uuid;
20 |
21 | #[derive(Item, Debug, Clone)]
22 | pub struct Book {
23 | #[dynomite(partition_key, rename = "Id")]
24 | id: Uuid,
25 | #[dynomite(rename = "bookTitle", default)]
26 | title: String,
27 | }
28 |
29 | /// create a book table with a single string (S) primary key.
31 | /// if this table does not already exist
31 | /// this may take a second or two to provision.
32 | /// it will fail if this table already exists but that's okay,
33 | /// this is just an example :)
34 | async fn bootstrap<D>(
35 | client: &D,
36 | table_name: String,
37 | ) where
38 | D: DynamoDb,
39 | {
40 | let _ = client
41 | .create_table(CreateTableInput {
42 | table_name,
43 | key_schema: vec![KeySchemaElement {
44 | attribute_name: "Id".into(),
45 | key_type: "HASH".into(),
46 | }],
47 | attribute_definitions: vec![AttributeDefinition {
48 | attribute_name: "Id".into(),
49 | attribute_type: "S".into(),
50 | }],
51 | provisioned_throughput: Some(ProvisionedThroughput {
52 | read_capacity_units: 1,
53 | write_capacity_units: 1,
54 | }),
55 | ..CreateTableInput::default()
56 | })
57 | .await;
58 | }
59 |
60 | // this will create a rust book shelf in your aws account!
61 | #[tokio::main]
62 | async fn main() -> Result<(), Box<dyn Error>> {
63 | env_logger::init();
64 | // create rusoto client
65 | let client = DynamoDbClient::new(Region::Custom {
66 | name: "us-east-1".into(),
67 | endpoint: "http://localhost:8000".into(),
68 | })
69 | .with_retries(Policy::default());
70 |
71 | let table_name = "books".to_string();
72 |
73 | bootstrap(&client, table_name.clone()).await;
74 |
75 | let book = Book {
76 | id: Uuid::new_v4(),
77 | title: "rust".into(),
78 | };
79 |
80 | // print the key for this book
81 | // requires bringing `dynomite::Item` into scope
82 | println!("book.key() {:#?}", book.key());
83 |
84 | // add a book to the shelf
85 | println!(
86 | "put_item() result {:#?}",
87 | client
88 | .put_item(PutItemInput {
89 | table_name: table_name.clone(),
90 | item: book.clone().into(), // <= convert book into its attribute map representation
91 | ..PutItemInput::default()
92 | })
93 | .await?
94 | );
95 |
96 | println!(
97 | "put_item() result {:#?}",
98 | client
99 | .put_item(PutItemInput {
100 | table_name: table_name.clone(),
101 | // convert book into its attribute map representation
102 | item: Book {
103 | id: Uuid::new_v4(),
104 | title: "rust and beyond".into(),
105 | }
106 | .into(),
107 | ..PutItemInput::default()
108 | })
109 | .await?
110 | );
111 |
112 | // scan through all pages of results in the books table for books whose title is "rust"
113 | println!(
114 | "scan result {:#?}",
115 | client
116 | .clone()
117 | .scan_pages(ScanInput {
118 | limit: Some(1), // to demonstrate we're getting through more than one page
119 | table_name: table_name.clone(),
120 | filter_expression: Some("bookTitle = :title".into()),
121 | expression_attribute_values: Some(attr_map!(
122 | ":title" => "rust".to_string()
123 | )),
124 | ..ScanInput::default()
125 | })
126 | .try_for_each(|item| {
127 | println!("stream_scan() item {:#?}", Book::try_from(item));
128 | future::ready(Ok(()))
129 | })
130 | .await? // attempt to convert an attribute map to a book type
131 | );
132 |
133 | // get the "rust" book by the Book type's generated key
134 | println!(
135 | "get_item() result {:#?}",
136 | client
137 | .get_item(GetItemInput {
138 | table_name,
139 | key: book.key(), // get a book by key
140 | ..GetItemInput::default()
141 | })
142 | .await?
143 | .item
144 | .map(Book::try_from) // attempt to convert a attribute map to a book type
145 | );
146 |
147 | Ok(())
148 | }
149 |
--------------------------------------------------------------------------------
/dynomite/examples/stack.cf.yml:
--------------------------------------------------------------------------------
1 | Resources:
2 | DDBTable:
3 | Type: AWS::DynamoDB::Table
4 | Properties:
5 | TableName: "books"
6 | AttributeDefinitions:
7 | - AttributeName: "id"
8 | AttributeType: "S"
9 | - AttributeName: "title"
10 | AttributeType: "S"
11 | KeySchema:
12 | - AttributeName: "id"
13 | KeyType: "HASH"
14 | ProvisionedThroughput:
15 | ReadCapacityUnits: 1
16 | WriteCapacityUnits: 1
17 |
18 |
--------------------------------------------------------------------------------
/dynomite/src/error.rs:
--------------------------------------------------------------------------------
1 | //! Dynomite error types
2 | use std::{error::Error, fmt};
3 |
4 | /// Errors that may result from attribute value conversions
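///
/// For example, a minimal sketch: reading a boolean `AttributeValue` back as a
/// `String` yields `InvalidType`.
///
/// ```
/// use dynomite::{dynamodb::AttributeValue, Attribute, AttributeError};
///
/// let attr = AttributeValue {
///     bool: Some(true),
///     ..AttributeValue::default()
/// };
/// // the value is present but is not of the expected (S) type
/// assert_eq!(String::from_attr(attr), Err(AttributeError::InvalidType));
/// ```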
5 | #[derive(Debug, PartialEq)]
6 | pub enum AttributeError {
7 | /// Will be returned if an AttributeValue is present, and is of the expected
8 | /// type but its contents are not well-formatted
9 | InvalidFormat,
10 | /// Will be returned if provided AttributeValue is not of the expected type
11 | InvalidType,
12 | /// Will be returned if the provided attributes do not include an
13 | /// expected named value
14 | MissingField {
15 | /// Name of the field that is missing
16 | name: String,
17 | },
18 | }
19 |
20 | impl fmt::Display for AttributeError {
21 | fn fmt(
22 | &self,
23 | f: &mut fmt::Formatter<'_>,
24 | ) -> fmt::Result {
25 | match self {
26 | AttributeError::InvalidFormat => write!(f, "Invalid format"),
27 | AttributeError::InvalidType => write!(f, "Invalid type"),
28 | AttributeError::MissingField { name } => write!(f, "Missing field {}", name),
29 | }
30 | }
31 | }
32 |
33 | impl Error for AttributeError {}
34 |
35 | #[cfg(test)]
36 | mod tests {
37 | use super::AttributeError;
38 | use std::error::Error;
39 |
40 | #[test]
41 | fn attribute_error_impl_std_error() {
42 | fn test(_: impl Error) {}
43 | test(AttributeError::InvalidFormat)
44 | }
45 |
46 | #[test]
47 | fn invalid_format_displays() {
48 | assert_eq!(
49 | "Invalid format",
50 | format!("{}", AttributeError::InvalidFormat)
51 | )
52 | }
53 |
54 | #[test]
55 | fn invalid_type_displays() {
56 | assert_eq!("Invalid type", format!("{}", AttributeError::InvalidType))
57 | }
58 |
59 | #[test]
60 | fn missing_field_displays() {
61 | assert_eq!(
62 | "Missing field foo",
63 | format!("{}", AttributeError::MissingField { name: "foo".into() })
64 | )
65 | }
66 | }
67 |
--------------------------------------------------------------------------------
/dynomite/src/ext.rs:
--------------------------------------------------------------------------------
1 | //! Extension interfaces for rusoto `DynamoDb`
2 |
3 | use crate::dynamodb::{
4 | AttributeValue, BackupSummary, DynamoDb, ListBackupsError, ListBackupsInput, ListTablesError,
5 | ListTablesInput, QueryError, QueryInput, ScanError, ScanInput,
6 | };
7 | use futures::{stream, Stream, TryStreamExt};
8 | use rusoto_core::RusotoError;
9 | use std::{collections::HashMap, pin::Pin};
10 |
11 | type DynomiteStream<I, E> = Pin<Box<dyn Stream<Item = Result<I, RusotoError<E>>> + Send>>;
12 |
13 | /// Extension methods for DynamoDb client types
14 | ///
15 | /// A default impl is provided for `DynamoDb + Clone + Send + Sync + 'static` which adds auto-paginating `Stream` interfaces that require
16 | /// taking ownership.
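///
/// For example, a minimal sketch of draining every table name (assuming default
/// credentials and region):
///
/// ```rust,no_run
/// use dynomite::{dynamodb::{DynamoDbClient, ListTablesInput}, DynamoDbExt};
/// use futures::TryStreamExt;
/// use rusoto_core::Region;
///
/// # async fn example() -> Result<(), Box<dyn std::error::Error>> {
/// let client = DynamoDbClient::new(Region::default());
/// // `list_tables_pages` consumes the client and walks page boundaries for us
/// let mut names = client.list_tables_pages(ListTablesInput::default());
/// while let Some(name) = names.try_next().await? {
///     println!("{}", name);
/// }
/// # Ok(())
/// # }
/// ```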
17 | pub trait DynamoDbExt {
18 | // see https://github.com/boto/botocore/blob/6906e8e7e8701c80f0b270c42be509cff4375e38/botocore/data/dynamodb/2012-08-10/paginators-1.json
19 |
20 | /// An auto-paginating `Stream` oriented version of `list_backups`
21 | fn list_backups_pages(
22 | self,
23 | input: ListBackupsInput,
24 | ) -> DynomiteStream<BackupSummary, ListBackupsError>;
25 |
26 | /// An auto-paginating `Stream` oriented version of `list_tables`
27 | fn list_tables_pages(
28 | self,
29 | input: ListTablesInput,
30 | ) -> DynomiteStream<String, ListTablesError>;
31 |
32 | /// An auto-paginating `Stream` oriented version of `query`
33 | fn query_pages(
34 | self,
35 | input: QueryInput,
36 | ) -> DynomiteStream<HashMap<String, AttributeValue>, QueryError>;
37 |
38 | /// An auto-paginating `Stream` oriented version of `scan`
39 | fn scan_pages(
40 | self,
41 | input: ScanInput,
42 | ) -> DynomiteStream<HashMap<String, AttributeValue>, ScanError>;
43 | }
44 |
45 | impl<D> DynamoDbExt for D
46 | where
47 | D: DynamoDb + Clone + Send + Sync + 'static,
48 | {
49 | fn list_backups_pages(
50 | self,
51 | input: ListBackupsInput,
52 | ) -> DynomiteStream<BackupSummary, ListBackupsError> {
53 | enum PageState {
54 | Next(Option<String>, ListBackupsInput),
55 | End,
56 | }
57 | Box::pin(
58 | stream::try_unfold(
59 | PageState::Next(input.exclusive_start_backup_arn.clone(), input),
60 | move |state| {
61 | let clone = self.clone();
62 | async move {
63 | let (exclusive_start_backup_arn, input) = match state {
64 | PageState::Next(start, input) => (start, input),
65 | PageState::End => {
66 | return Ok(None) as Result<_, RusotoError<ListBackupsError>>
67 | }
68 | };
69 | let resp = clone
70 | .list_backups(ListBackupsInput {
71 | exclusive_start_backup_arn,
72 | ..input.clone()
73 | })
74 | .await?;
75 | let next_state = match resp
76 | .last_evaluated_backup_arn
77 | .filter(|next| !next.is_empty())
78 | {
79 | Some(next) => PageState::Next(Some(next), input),
80 | _ => PageState::End,
81 | };
82 | Ok(Some((
83 | stream::iter(
84 | resp.backup_summaries
85 | .unwrap_or_default()
86 | .into_iter()
87 | .map(Ok),
88 | ),
89 | next_state,
90 | )))
91 | }
92 | },
93 | )
94 | .try_flatten(),
95 | )
96 | }
97 |
98 | fn list_tables_pages(
99 | self,
100 | input: ListTablesInput,
101 | ) -> DynomiteStream<String, ListTablesError> {
102 | enum PageState {
103 | Next(Option<String>, ListTablesInput),
104 | End,
105 | }
106 | Box::pin(
107 | stream::try_unfold(
108 | PageState::Next(input.exclusive_start_table_name.clone(), input),
109 | move |state| {
110 | let clone = self.clone();
111 | async move {
112 | let (exclusive_start_table_name, input) = match state {
113 | PageState::Next(start, input) => (start, input),
114 | PageState::End => {
115 | return Ok(None) as Result<_, RusotoError<ListTablesError>>
116 | }
117 | };
118 | let resp = clone
119 | .list_tables(ListTablesInput {
120 | exclusive_start_table_name,
121 | ..input.clone()
122 | })
123 | .await?;
124 | let next_state = match resp
125 | .last_evaluated_table_name
126 | .filter(|next| !next.is_empty())
127 | {
128 | Some(next) => PageState::Next(Some(next), input),
129 | _ => PageState::End,
130 | };
131 | Ok(Some((
132 | stream::iter(resp.table_names.unwrap_or_default().into_iter().map(Ok)),
133 | next_state,
134 | )))
135 | }
136 | },
137 | )
138 | .try_flatten(),
139 | )
140 | }
141 |
142 | fn query_pages(
143 | self,
144 | input: QueryInput,
145 | ) -> DynomiteStream<HashMap<String, AttributeValue>, QueryError> {
146 | #[allow(clippy::large_enum_variant)]
147 | enum PageState {
148 | Next(Option<HashMap<String, AttributeValue>>, QueryInput),
149 | End,
150 | }
151 | Box::pin(
152 | stream::try_unfold(
153 | PageState::Next(input.exclusive_start_key.clone(), input),
154 | move |state| {
155 | let clone = self.clone();
156 | async move {
157 | let (exclusive_start_key, input) = match state {
158 | PageState::Next(start, input) => (start, input),
159 | PageState::End => {
160 | return Ok(None) as Result<_, RusotoError<QueryError>>
161 | }
162 | };
163 | let resp = clone
164 | .query(QueryInput {
165 | exclusive_start_key,
166 | ..input.clone()
167 | })
168 | .await?;
169 | let next_state =
170 | match resp.last_evaluated_key.filter(|next| !next.is_empty()) {
171 | Some(next) => PageState::Next(Some(next), input),
172 | _ => PageState::End,
173 | };
174 | Ok(Some((
175 | stream::iter(resp.items.unwrap_or_default().into_iter().map(Ok)),
176 | next_state,
177 | )))
178 | }
179 | },
180 | )
181 | .try_flatten(),
182 | )
183 | }
184 |
185 | fn scan_pages(
186 | self,
187 | input: ScanInput,
188 | ) -> DynomiteStream<HashMap<String, AttributeValue>, ScanError> {
189 | #[allow(clippy::large_enum_variant)]
190 | enum PageState {
191 | Next(Option<HashMap<String, AttributeValue>>, ScanInput),
192 | End,
193 | }
194 | Box::pin(
195 | stream::try_unfold(
196 | PageState::Next(input.exclusive_start_key.clone(), input),
197 | move |state| {
198 | let clone = self.clone();
199 | async move {
200 | let (exclusive_start_key, input) = match state {
201 | PageState::Next(start, input) => (start, input),
202 | PageState::End => return Ok(None) as Result<_, RusotoError<ScanError>>,
203 | };
204 | let resp = clone
205 | .scan(ScanInput {
206 | exclusive_start_key,
207 | ..input.clone()
208 | })
209 | .await?;
210 | let next_state =
211 | match resp.last_evaluated_key.filter(|next| !next.is_empty()) {
212 | Some(next) => PageState::Next(Some(next), input),
213 | _ => PageState::End,
214 | };
215 | Ok(Some((
216 | stream::iter(resp.items.unwrap_or_default().into_iter().map(Ok)),
217 | next_state,
218 | )))
219 | }
220 | },
221 | )
222 | .try_flatten(),
223 | )
224 | }
225 | }
226 |
--------------------------------------------------------------------------------
/dynomite/src/lib.rs:
--------------------------------------------------------------------------------
1 | //! Dynomite is the set of high-level interfaces making interacting with [AWS DynamoDB](https://aws.amazon.com/dynamodb/) more productive.
2 | //!
3 | //! 💡To learn more about DynamoDB, see [this helpful guide](https://www.dynamodbguide.com/).
4 | //!
5 | //! ## Data Modeling
6 | //!
7 | //! Dynomite adapts Rust's native types to
8 | //! DynamoDB's [core components](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html)
9 | //! to form a coherent interface.
10 | //!
11 | //! The [Attribute](trait.Attribute.html) type
12 | //! provides conversion interfaces to and from Rust's native scalar types which represent
13 | //! DynamoDB's notion of "attributes". The goal of this type is to make representing
14 | //! AWS typed values feel more natural and ergonomic in Rust. Where a conversion is not available you can implement `Attribute` for your own
15 | //! types to leverage higher level functionality.
16 | //!
17 | //! The [Item](trait.Item.html) trait
18 | //! provides conversion interfaces for complex types which represent
19 | //! DynamoDB's notion of "items".
20 | //!
21 | //! 💡 A cargo feature named `"derive"` makes it easy to derive `Item` instances for your custom types. This feature is enabled by default.
22 | //!
23 | //!
24 | //! ```rust,no_run
25 | //! use dynomite::{Item, Attributes};
26 | //! use uuid::Uuid;
27 | //!
28 | //! #[derive(Item)]
29 | //! struct Order {
30 | //! #[dynomite(partition_key)]
31 | //! user: Uuid,
32 | //! #[dynomite(sort_key)]
33 | //! order_id: Uuid,
34 | //! color: Option<String>,
35 | //! }
36 | //! ```
37 | //!
38 | //! ## Attributes
39 | //!
40 | //! ### `#[derive(Item)]`
41 | //! Used to define a top-level DynamoDB item.
42 | //! Generates a `Key` struct with only `partition_key/sort_key`
43 | //! fields to be used for type-safe primary key construction.
44 | //! This automatically derives [`Attributes`](#deriveattributes) too.
45 | //!
46 | //! For the `Order` struct from the example above this will generate an `OrderKey`
47 | //! struct like this:
48 | //!
49 | //! ```rust
50 | //! # use uuid::Uuid;
51 | //! # use dynomite::Attributes;
52 | //! #[derive(Attributes)]
53 | //! struct OrderKey {
54 | //! user: Uuid,
55 | //! order_id: Uuid,
56 | //! }
57 | //! ```
58 | //!
59 | //! Use it to safely and conveniently construct the primary key:
60 | //!
61 | //! ```rust
62 | //! # #[derive(dynomite::Attributes)]
63 | //! # struct Order {}
64 | //! # #[derive(Attributes)]
65 | //! # struct OrderKey {
66 | //! # user: Uuid,
67 | //! # order_id: Uuid,
68 | //! # }
69 | //! use dynomite::{
70 | //! dynamodb::{DynamoDb, GetItemInput},
71 | //! Attributes, FromAttributes,
72 | //! };
73 | //! use std::{convert::TryFrom, error::Error};
74 | //! use uuid::Uuid;
75 | //!
76 | //! async fn get_order(
77 | //! client: impl DynamoDb,
78 | //! user: Uuid,
79 | //! order_id: Uuid,
80 | //! ) -> Result<Option<Order>, Box<dyn Error>> {
81 | //! // Use the generated `OrderKey` struct to create a primary key
82 | //! let key = OrderKey { user, order_id };
83 | //! // Convert strongly-typed `OrderKey` to a map of `rusoto_dynamodb::AttributeValue`
84 | //! let key: Attributes = key.into();
85 | //!
86 | //! let result = client
87 | //! .get_item(GetItemInput {
88 | //! table_name: "orders".into(),
89 | //! key,
90 | //! ..Default::default()
91 | //! })
92 | //! .await?;
93 | //!
94 | //! Ok(result
95 | //! .item
96 | //! .map(|item| Order::try_from(item).expect("Invalid order, db corruption?")))
97 | //! }
98 | //! ```
99 | //!
100 | //! - `#[dynomite(partition_key)]` - required attribute, expected to be applied to the target
101 | //! [partition attribute][partition-key] field with a derivable DynamoDB attribute value
102 | //! of String, Number or Binary
103 | //!
104 | //! - `#[dynomite(sort_key)]` - optional attribute, may be applied to one target
105 | //! [sort attribute][sort-key] field with a derivable DynamoDB attribute value
106 | //! of String, Number or Binary
107 | //!
108 | //! - All other attributes are the same as for [`#[derive(Attributes)]`](#deriveattributes)
109 | //!
110 | //! ### `#[derive(Attributes)]`
111 | //!
112 | //! Used to derive an implementation of `From/IntoAttributes` trait to allow for
113 | //! serializing/deserializing map-like types into [`AttributeValue`].
114 | //! This also generates `TryFrom` and `Into` implementations.
115 | //!
116 | //! - `#[dynomite(rename = "actualName")]` - optional attribute, may be applied to any item
117 | //! attribute field, useful when the DynamoDB table you're interfacing with has
118 | //! attributes whose names don't follow Rust's naming conventions
119 | //!
120 | //! - `#[dynomite(skip_serializing_if = "expr_that_returns_function")]` - place this on a field
121 | //! that should be skipped in the output map entirely if the given function returns `true`.
122 | //! The value of this attribute must be a path to a function that satisfies the signature
123 | //! `FnOnce(&T) -> bool`, where `T` is the field type (possibly after some auto-deref coercions).
124 | //!
125 | //! This is inspired by [`#[serde(skip_serializing_if = "...")]`][serde-skip-serializing-if].
126 | //!
127 | //! This attribute may be used to skip serializing the empty set for example
128 | //! (which is not supported by current DynamoDB version, but it may be in future).
129 | //!
130 | //! ```
131 | //! use dynomite::Attributes;
132 | //! use std::collections::HashSet;
133 | //!
134 | //! #[derive(Attributes)]
135 | //! struct UniqueStrings {
136 | //! #[dynomite(skip_serializing_if = "HashSet::is_empty")]
137 | //! strings: HashSet<String>,
138 | //!
139 | //! #[dynomite(skip_serializing_if = "is_99")]
140 | //! skip_if_99: u32,
141 | //! }
142 | //!
143 | //! fn is_99(&num: &u32) -> bool {
144 | //! num == 99
145 | //! }
146 | //! ```
147 | //!
148 | //! - `#[dynomite(default)]` - use [`Default::default`] implementation of the field type
149 | //! if the attribute is absent when deserializing from `Attributes`
150 | //!
151 | //! ```
152 | //! use dynomite::Attributes;
153 | //!
154 | //! #[derive(Attributes)]
155 | //! struct Todos {
156 | //! // use Default value of the field if it is absent in DynamoDb (empty vector)
157 | //! #[dynomite(default)]
158 | //! items: Vec<String>,
159 | //! list_name: String,
160 | //! }
161 | //! ```
162 | //!
163 | //! - `#[dynomite(flatten)]` - flattens the fields of other struct that also derives `Attributes`
164 | //! into the current struct.
165 | //!
166 | //! 💡 If this attribute is placed onto a field, no other `dynomite` attributes
167 | //! are allowed on this field (this restriction may be relaxed in future).
168 | //!
169 | //! This is reminiscent of [`#[serde(flatten)]`][serde-flatten]. The order of
170 | //! declaration of `flatten`ed fields matters: if the struct has two fields with
171 | //! `#[dynomite(flatten)]` attribute the one that appears higher in code will
172 | //! be evaluated before the other one. This is crucial when you want to collect
173 | //! additional properties into a map:
174 | //!
175 | //! ```
176 | //! use dynomite::{Attributes, Item};
177 | //!
178 | //! #[derive(Item)]
179 | //! struct ShoppingCart {
180 | //! #[dynomite(partition_key)]
181 | //! id: String,
182 | //! // A separate struct to store data without any id
183 | //! #[dynomite(flatten)]
184 | //! data: ShoppingCartData,
185 | //! // Collect all other additional attributes into a map
186 | //! // Beware that the order of declaration will affect the order of
187 | //! // evaluation, so this "wildcard" flatten clause should be the last member
188 | //! #[dynomite(flatten)]
189 | //! remaining_props: Attributes,
190 | //! }
191 | //!
192 | //! // `Attributes` doesn't require either of #[dynomite(partition_key/sort_key)]
193 | //! #[derive(Attributes)]
194 | //! struct ShoppingCartData {
195 | //! name: String,
196 | //! total_price: u32,
197 | //! }
198 | //! ```
199 | //!
200 | //! #### Fat enums
201 | //!
202 | //! Fat enums are naturally supported by `#[derive(Attributes)]`.
203 | //! As for now, there is a limitation that the members of the enum must be
204 | //! either unit or one-element tuple variants. This restriction will be relaxed
205 | //! in future versions of `dynomite`.
206 | //!
207 | //! Deriving `Attributes` on fat enums currently uses
208 | //! [internally tagged enum pattern][internally-tagged-enum] (inspired by serde).
209 | //! Thus, you have to explicitly specify the **field name** of enum tag
210 | //! via the `tag` attribute on an enum.
211 | //!
212 | //! For example, the following definition:
213 | //!
214 | //! ```
215 | //! use dynomite::Attributes;
216 | //!
217 | //! #[derive(Attributes)]
218 | //! // Name of the field where to store the discriminant in DynamoDB
219 | //! #[dynomite(tag = "kind")]
220 | //! enum Shape {
221 | //! Rectangle(Rectangle),
222 | //! // Use `rename` to change the **value** of the tag for a particular variant
223 | //! // by default the tag for a particular variant is the name of the variant verbatim
224 | //! #[dynomite(rename = "my_circle")]
225 | //! Circle(Circle),
226 | //! Unknown,
227 | //! }
228 | //!
229 | //! #[derive(Attributes)]
230 | //! struct Circle {
231 | //! radius: u32,
232 | //! }
233 | //!
234 | //! #[derive(Attributes)]
235 | //! struct Rectangle {
236 | //! width: u32,
237 | //! height: u32,
238 | //! }
239 | //! ```
240 | //!
241 | //! corresponds to the following representation in DynamoDB for each enum variant:
242 | //!
243 | //! - `Rectangle`:
244 | //! ```json
245 | //! {
246 | //! "kind": "Rectangle",
247 | //! "width": 42,
248 | //! "height": 64
249 | //! }
250 | //! ```
251 | //! - `Circle`:
252 | //! ```json
253 | //! {
254 | //! "kind": "my_circle",
255 | //! "radius": 54
256 | //! }
257 | //! ```
258 | //! - `Unknown`:
259 | //! ```json
260 | //! {
261 | //! "kind": "Unknown"
262 | //! }
263 | //! ```
264 | //!
265 | //! If you have a plain old enum (without any data fields), you should use
266 | //! [`#[derive(Attribute)]`](#deriveattribute) instead.
267 | //!
268 | //! ### `#[derive(Attribute)]`
269 | //!
270 | //! Derives an implementation of [`Attribute`] for the plain enum.
271 | //! If you want to use a fat enum see [this paragraph](#fat-enums) instead.
272 | //!
273 | //! The enum itself will be represented as a string with the name of the variant
274 | //! it represents.
275 | //! In contrast, having [`#[derive(Attributes)]`](#deriveattributes) on an enum
276 | //! causes it to be represented as an object with a tag field,
277 | //! which implies an additional layer of indirection.
278 | //!
279 | //! ```
280 | //! use dynomite::{Attribute, Item};
281 | //!
282 | //! #[derive(Attribute)]
283 | //! enum UserRole {
284 | //! Admin,
285 | //! Moderator,
286 | //! Regular,
287 | //! }
288 | //!
289 | //! #[derive(Item)]
290 | //! struct User {
291 | //! #[dynomite(partition_key)]
292 | //! id: String,
293 | //! role: UserRole,
294 | //! }
295 | //! ```
296 | //!
297 | //! This data model will have the following representation in DynamoDB:
298 | //!
299 | //! ```json
300 | //! {
301 | //! "id": "d97de525-c81d-46d4-b945-d01b3a0f9165",
302 | //! "role": "Admin"
303 | //! }
304 | //! ```
305 | //!
306 | //! `role` field here may be any of `Admin`, `Moderator`, or `Regular` strings.
307 | //!
308 | //! ## Rusoto extensions
309 | //!
310 | //! By importing the [dynomite::DynamoDbExt](trait.DynamoDbExt.html) trait, dynomite
311 | //! adds client interfaces for creating async Stream-based auto pagination interfaces.
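//!
//! For example, a minimal sketch that scans a hypothetical "books" table across
//! all result pages:
//!
//! ```rust,no_run
//! use dynomite::{dynamodb::{DynamoDbClient, ScanInput}, DynamoDbExt};
//! use futures::{future, TryStreamExt};
//! use rusoto_core::Region;
//!
//! # async fn example() -> Result<(), Box<dyn std::error::Error>> {
//! let client = DynamoDbClient::new(Region::default());
//! // `scan_pages` consumes the client and yields items across page boundaries
//! client
//!     .scan_pages(ScanInput {
//!         table_name: "books".into(),
//!         ..ScanInput::default()
//!     })
//!     .try_for_each(|item| {
//!         println!("{:#?}", item);
//!         future::ready(Ok(()))
//!     })
//!     .await?;
//! # Ok(())
//! # }
//! ```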
312 | //!
313 | //! ## Robust retries
314 | //!
315 | //! By importing the [dynomite::Retries](retry/trait.Retries.html) trait, dynomite
316 | //! provides an interface for adding configurable retry policies to your
317 | //! rusoto DynamoDb clients.
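//!
//! For example, a minimal sketch using a fixed pause between a limited number of retries:
//!
//! ```rust,no_run
//! use dynomite::{dynamodb::DynamoDbClient, retry::Policy, Retries};
//! use std::time::Duration;
//!
//! // retry retryable failures up to 3 times, pausing 10ms in between attempts
//! let client = DynamoDbClient::new(Default::default())
//!     .with_retries(Policy::Pause(3, Duration::from_millis(10)));
//! ```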
318 | //!
319 | //! # Errors
320 | //!
321 | //! Some operations which require coercion from AWS to Rust types may fail, which results in an
322 | //! [AttributeError](error/enum.AttributeError.html).
323 | //!
324 | //! # Cargo Features
325 | //!
326 | //! This crate has a few cargo features of note.
327 | //!
328 | //! ## uuid
329 | //!
330 | //! Enabled by default, the `uuid` feature adds support for implementing `Attribute` for
331 | //! the [uuid](https://crates.io/crates/uuid) crate's type `Uuid`, a useful
332 | //! type for producing and representing
333 | //! unique identifiers for items that satisfy [effective characteristics for partition keys](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-partition-key-design.html)
334 | //!
335 | //! ## chrono
336 | //!
337 | //! Enabled by default, the `chrono` feature adds an implementation of `Attribute` for
338 | //! the std's [SystemTime](https://doc.rust-lang.org/std/time/struct.SystemTime.html) and chrono [`DateTime`](https://docs.rs/chrono/0.4.11/chrono/struct.DateTime.html) types which
339 | //! internally use [rfc3339 timestamps](https://www.ietf.org/rfc/rfc3339.txt).
340 | //!
341 | //! ## derive
342 | //!
343 | //! Enabled by default, the `derive` feature enables the use of the dynomite derive feature which
344 | //! allows you to simply add `#[derive(Item)]` to your structs.
345 | //!
346 | //! ## rustls
347 | //!
348 | //! Disabled by default, the `rustls` feature overrides Rusoto's default tls
349 | //! dependency on OpenSSL, replacing it with a [`rustls`](https://crates.io/crates/rustls) based tls implementation. When you
350 | //! enable this feature, it will also enable `uuid` and `derive` by default.
351 | //!
352 | //! To disable any of these features
353 | //!
354 | //! ```toml
355 | //! [dependencies.dynomite]
356 | //! version = "xxx"
357 | //! default-features = false
358 | //! features = ["feature-you-want"]
359 | //! ```
360 | //!
361 | //! [partition-key]: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.PrimaryKey
362 | //! [sort-key]: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.SecondaryIndexes
363 | //! [internally-tagged-enum]: https://serde.rs/enum-representations.html#internally-tagged
364 | //! [`Default::default`]: https://doc.rust-lang.org/stable/std/default/trait.Default.html#tymethod.default
365 | //! [`AttributeValue`]: https://docs.rs/rusoto_dynamodb/*/rusoto_dynamodb/struct.AttributeValue.html
366 | //! [`Attribute`]: trait.Attribute.html
367 | //! [serde-skip-serializing-if]: https://serde.rs/attr-skip-serializing.html
368 | //! [serde-flatten]: https://serde.rs/attr-flatten.html
369 |
370 | #![deny(missing_docs)]
371 | // reexported
372 | // note: this is used inside the attr_map! macro
373 | // #[cfg(feature = "default")]
374 | // pub use rusoto_dynamodb_default as dynamodb;
375 | //
376 | // #[cfg(feature = "rustls")]
377 | // pub use rusoto_dynamodb_rustls as dynamodb;
378 |
379 | use bytes::Bytes;
380 | #[cfg(feature = "chrono")]
381 | use chrono::{
382 | offset::{FixedOffset, Local},
383 | DateTime, Utc,
384 | };
385 | pub use rusoto_dynamodb as dynamodb;
386 |
387 | // we re-export this because we
388 | // refer to it within derive macros
389 | #[doc(hidden)]
390 | pub use dynamodb::AttributeValue;
391 | use std::{
392 | borrow::Cow,
393 | collections::{BTreeMap, BTreeSet, HashMap, HashSet},
394 | time::SystemTime,
395 | };
396 | #[cfg(feature = "uuid")]
397 | use uuid::Uuid;
398 |
399 | pub mod error;
400 | mod ext;
401 | pub mod retry;
402 |
403 | pub use crate::{ext::DynamoDbExt, retry::Retries};
404 |
405 | pub use crate::error::AttributeError;
406 | /// Type alias for map of named attribute values
407 | pub type Attributes = HashMap<String, AttributeValue>;
408 |
409 | /// A type which can be converted to and from a set of String keys and
410 | /// `AttributeValues`.
411 | ///
412 | /// # Examples
413 | ///
414 | /// Below is an example of doing this manually for demonstration.
415 | ///
416 | /// ```
417 | /// use dynomite::{
418 | /// dynamodb::AttributeValue, Attribute, AttributeError, Attributes, FromAttributes,
419 | /// IntoAttributes, Item,
420 | /// };
421 | /// use std::{collections::HashMap, convert::TryFrom};
422 | ///
423 | /// #[derive(PartialEq, Debug, Clone)]
424 | /// struct Person {
425 | /// id: String,
426 | /// }
427 | ///
428 | /// impl Item for Person {
429 | /// fn key(&self) -> Attributes {
430 | /// let mut attrs = HashMap::new();
431 | /// attrs.insert("id".into(), "123".to_string().into_attr());
432 | /// attrs
433 | /// }
434 | /// }
435 | ///
436 | /// impl FromAttributes for Person {
437 | /// fn from_attrs(attrs: &mut Attributes) -> Result<Self, AttributeError> {
438 | /// Ok(Self {
439 | /// id: attrs
440 | /// .remove("id")
441 | /// .and_then(|val| val.s)
442 | /// .ok_or_else(|| AttributeError::MissingField { name: "id".into() })?,
443 | /// })
444 | /// }
445 | /// }
446 | ///
447 | /// impl IntoAttributes for Person {
448 | /// fn into_attrs(
449 | /// self,
450 | /// attrs: &mut Attributes,
451 | /// ) {
452 | /// attrs.insert("id".into(), "123".to_string().into_attr());
453 | /// }
454 | /// }
455 | ///
456 | /// // Unfortunately `dynomite` is not able to provide a blanket impl for std::convert traits
457 | /// // due to orphan rules, but they are generated via the `dynomite_derive` attributes
458 | ///
459 | /// impl TryFrom<Attributes> for Person {
460 | /// type Error = AttributeError;
461 | ///
462 | /// fn try_from(mut attrs: Attributes) -> Result<Self, Self::Error> {
463 | /// Person::from_attrs(&mut attrs)
464 | /// }
465 | /// }
466 | ///
467 | /// impl From<Person> for Attributes {
468 | /// fn from(person: Person) -> Attributes {
469 | /// let mut map = HashMap::new();
470 | /// person.into_attrs(&mut map);
471 | /// map
472 | /// }
473 | /// }
474 | ///
475 | /// let person = Person { id: "123".into() };
476 | /// let attrs: Attributes = person.clone().into();
477 | /// assert_eq!(Ok(person), Person::try_from(attrs))
478 | /// ```
479 | ///
480 | /// You can get this all for free automatically using `#[derive(Item)]` on your structs. This is the recommended approach.
481 | ///
482 | /// ```
483 | /// use dynomite::Item;
484 | /// #[derive(Item)]
485 | /// struct Book {
486 | /// #[dynomite(partition_key)]
487 | /// id: String,
488 | /// }
489 | /// ```
490 | ///
491 | /// ## Renaming fields
492 | ///
493 | /// In some cases you may be dealing with a DynamoDB table whose
494 | /// fields are named using conventions that do not align with Rust's conventions.
495 | /// You can leverage the `rename` attribute to map Rust's fields back to its source name
496 | /// explicitly
497 | ///
498 | /// ```
499 | /// use dynomite::Item;
500 | ///
501 | /// #[derive(Item)]
502 | /// struct Book {
503 | /// #[dynomite(partition_key)]
504 | /// id: String,
505 | /// #[dynomite(rename = "notConventional")]
506 | /// not_conventional: String,
507 | /// }
508 | /// ```
509 | ///
510 | /// ## Accommodating sparse data
511 | ///
512 | /// In some cases you may be dealing with a DynamoDB table whose
513 | /// fields are absent for some records. This is different from fields whose records
514 | /// have `NULL` attribute type values. In these cases you can use the `default` field
515 | /// attribute to communicate that the `std::default::Default::default()` value for the field's
516 | /// type will be used in the absence of data.
517 | ///
518 | /// ```
519 | /// use dynomite::Item;
520 | ///
521 | /// #[derive(Item)]
522 | /// struct Book {
523 | /// #[dynomite(partition_key)]
524 | /// id: String,
525 | /// #[dynomite(default)]
526 | /// summary: Option<String>,
527 | /// }
528 | /// ```
529 | ///
530 | /// ## Item attribute projections
531 | ///
532 | /// DynamoDB `Item`s are a set of attributes with a uniquely identifying
533 | /// partition key. At times, you may wish to project over these attributes into a type
534 | /// that does not include a partition_key. For that specific purpose, instead of
535 | /// deriving an `Item` type you'll want to derive `Attributes`
536 | ///
537 | /// ```
538 | /// use dynomite::Attributes;
539 | ///
540 | /// #[derive(Attributes)]
541 | /// struct BookProjection {
542 | /// author: String,
543 | /// #[dynomite(default)]
544 | /// summary: Option<String>
545 | /// }
546 | pub trait Item: IntoAttributes + FromAttributes {
547 | /// Returns the set of attributes which make up this item's primary key
548 | ///
549 | /// This is often used in item lookups
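///
/// A minimal sketch (the `Book` type here is just an illustration):
///
/// ```
/// use dynomite::Item;
///
/// #[derive(Item)]
/// struct Book {
///     #[dynomite(partition_key)]
///     id: String,
/// }
///
/// let book = Book { id: "123".into() };
/// // the key contains only the partition (and, if declared, sort) key attributes
/// assert!(book.key().contains_key("id"));
/// ```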
550 | fn key(&self) -> Attributes;
551 | }
552 |
553 | /// A type capable of being converted into and from an AWS `AttributeValue`
554 | ///
555 | /// Default implementations of this are provided for each type of `AttributeValue` field
556 | /// which map to naturally fitting native Rustlang types.
557 | ///
558 | /// # Examples
559 | ///
560 | /// ```
561 | /// use dynomite::{dynamodb::AttributeValue, Attribute};
562 | ///
563 | /// assert_eq!(
564 | /// "test".to_string().into_attr().s,
565 | /// AttributeValue {
566 | /// s: Some("test".to_string()),
567 | /// ..AttributeValue::default()
568 | /// }
569 | /// .s
570 | /// );
571 | /// ```
572 | pub trait Attribute: Sized {
573 | /// Returns a conversion into an `AttributeValue`
574 | fn into_attr(self) -> AttributeValue;
575 | /// Returns a fallible conversion from an `AttributeValue`
576 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError>;
577 | }
578 |
579 | impl Attribute for AttributeValue {
580 | fn into_attr(self) -> AttributeValue {
581 | self
582 | }
583 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
584 | Ok(value)
585 | }
586 | }
587 |
588 | /// A type capable of being produced from a set of string keys and [`AttributeValue`]s.
589 | /// Generally, you should not implement this trait manually.
590 | /// Use `#[derive(Attributes/Item)]` to generate the proper implementation instead.
591 | ///
592 | /// [`AttributeValue`]: https://docs.rs/rusoto_dynamodb/*/rusoto_dynamodb/struct.AttributeValue.html
593 | pub trait FromAttributes: Sized {
594 | /// Returns an instance of a type resolved at runtime from a collection
595 | /// of `String` keys and [`AttributeValue`]s.
596 | /// If an instance cannot be resolved, an `AttributeError` will be returned.
597 | /// The implementations of this method should remove the relevant key-value
598 | /// pairs from the map to consume them.
599 | ///
600 | /// This is needed to support `#[dynomite(flatten)]` without creating temporary hash maps.
601 | ///
602 | /// [`AttributeValue`]: https://docs.rs/rusoto_dynamodb/*/rusoto_dynamodb/struct.AttributeValue.html
603 | fn from_attrs(attrs: &mut Attributes) -> Result<Self, AttributeError>;
604 | }
605 |
606 | /// Coerces a homogeneous HashMap of attribute values into a homogeneous Map of types
607 | /// that implement `Attribute`
608 | #[allow(clippy::implicit_hasher)]
609 | impl<A: Attribute> FromAttributes for HashMap<String, A> {
610 | fn from_attrs(attrs: &mut Attributes) -> Result<Self, AttributeError> {
611 | attrs
612 | .drain()
613 | .map(|(k, v)| Ok((k, A::from_attr(v)?)))
614 | .collect()
615 | }
616 | }
617 |
618 | /// Coerces a homogeneous Map of attribute values into a homogeneous BTreeMap of types
619 | /// that implement Attribute
620 | impl<A: Attribute> FromAttributes for BTreeMap<String, A> {
621 | fn from_attrs(attrs: &mut Attributes) -> Result<Self, AttributeError> {
622 | attrs
623 | .drain()
624 | .map(|(k, v)| Ok((k, A::from_attr(v)?)))
625 | .collect()
626 | }
627 | }
628 |
629 | /// A type capable of being serialized into a set of string keys and [`AttributeValue`]s
630 | /// Generally, you should not implement this trait manually.
631 | /// Use `#[derive(Attributes/Item)]` to generate the proper implementation instead.
632 | ///
633 | /// It also generates `From<T> for Attributes` for your type
634 | /// (there is no blanket impl for `From` here due to orphan rules)
635 | ///
636 | /// [`AttributeValue`]: https://docs.rs/rusoto_dynamodb/*/rusoto_dynamodb/struct.AttributeValue.html
637 | pub trait IntoAttributes: Sized {
638 | /// Converts `self` into `Attributes` by accepting a `sink` argument and
639 | /// inserting attribute key-value pairs into it.
640 | /// This is needed to support `#[dynomite(flatten)]` without creating
641 | /// temporary hash maps.
642 | fn into_attrs(
643 | self,
644 | sink: &mut Attributes,
645 | );
646 | }
647 |
648 | impl<A: Attribute> IntoAttributes for HashMap<String, A> {
649 | fn into_attrs(
650 | self,
651 | sink: &mut Attributes,
652 | ) {
653 | sink.extend(self.into_iter().map(|(k, v)| (k, v.into_attr())));
654 | }
655 | }
656 |
657 | impl<A: Attribute> IntoAttributes for BTreeMap<String, A> {
658 | fn into_attrs(
659 | self,
660 | sink: &mut Attributes,
661 | ) {
662 | sink.extend(self.into_iter().map(|(k, v)| (k, v.into_attr())));
663 | }
664 | }
665 |
666 | /// A Map type for all hash-map-like values, represented as the `M` AttributeValue type
667 | impl<T: IntoAttributes + FromAttributes> Attribute for T {
668 | fn into_attr(self) -> AttributeValue {
669 | let mut map = HashMap::new();
670 | self.into_attrs(&mut map);
671 | AttributeValue {
672 | m: Some(map),
673 | ..AttributeValue::default()
674 | }
675 | }
676 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
677 | T::from_attrs(&mut value.m.ok_or(AttributeError::InvalidType)?)
678 | }
679 | }
680 |
681 | /// A `String` type for `Uuids`, represented by the `S` AttributeValue type
682 | #[cfg(feature = "uuid")]
683 | impl Attribute for Uuid {
684 | fn into_attr(self) -> AttributeValue {
685 | AttributeValue {
686 | s: Some(self.to_hyphenated().to_string()),
687 | ..AttributeValue::default()
688 | }
689 | }
690 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
691 | value
692 | .s
693 | .ok_or(AttributeError::InvalidType)
694 | .and_then(|s| Uuid::parse_str(s.as_str()).map_err(|_| AttributeError::InvalidFormat))
695 | }
696 | }
697 |
698 | /// An `rfc3339` formatted version of `DateTime<Utc>`, represented by the `S` AttributeValue type
699 | #[cfg(feature = "chrono")]
700 | impl Attribute for DateTime<Utc> {
701 | fn into_attr(self) -> AttributeValue {
702 | AttributeValue {
703 | s: Some(self.to_rfc3339()),
704 | ..Default::default()
705 | }
706 | }
707 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
708 | value
709 | .s
710 | .ok_or(AttributeError::InvalidType)
711 | .and_then(
712 | |s| match DateTime::parse_from_rfc3339(&s).map(|dt| dt.with_timezone(&Utc)) {
713 | Ok(date_time) => Ok(date_time),
714 | Err(_) => Err(AttributeError::InvalidFormat),
715 | },
716 | )
717 | }
718 | }
719 |
720 | /// An `rfc3339` formatted version of `DateTime<Local>`, represented by the `S` AttributeValue type
721 | #[cfg(feature = "chrono")]
722 | impl Attribute for DateTime<Local> {
723 | fn into_attr(self) -> AttributeValue {
724 | AttributeValue {
725 | s: Some(self.to_rfc3339()),
726 | ..Default::default()
727 | }
728 | }
729 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
730 | value
731 | .s
732 | .ok_or(AttributeError::InvalidType)
733 | .and_then(|s| {
734 | match DateTime::parse_from_rfc3339(&s).map(|dt| dt.with_timezone(&Local)) {
735 | Ok(date_time) => Ok(date_time),
736 | Err(_) => Err(AttributeError::InvalidFormat),
737 | }
738 | })
739 | }
740 | }
741 |
742 | /// An `rfc3339` formatted version of `DateTime<FixedOffset>`, represented by the `S` AttributeValue type
743 | #[cfg(feature = "chrono")]
744 | impl Attribute for DateTime<FixedOffset> {
745 | fn into_attr(self) -> AttributeValue {
746 | AttributeValue {
747 | s: Some(self.to_rfc3339()),
748 | ..Default::default()
749 | }
750 | }
751 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
752 | value
753 | .s
754 | .ok_or(AttributeError::InvalidType)
755 | .and_then(|s| match DateTime::parse_from_rfc3339(&s) {
756 | Ok(date_time) => Ok(date_time),
757 | Err(_) => Err(AttributeError::InvalidFormat),
758 | })
759 | }
760 | }
761 |
762 | /// An `rfc3339` formatted version of `SystemTime`, represented by the `S` AttributeValue type
763 | #[cfg(feature = "chrono")]
764 | impl Attribute for SystemTime {
765 | fn into_attr(self) -> AttributeValue {
766 | let dt: DateTime<Utc> = self.into();
767 | dt.into_attr()
768 | }
769 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
770 | value
771 | .s
772 | .ok_or(AttributeError::InvalidType)
773 | .and_then(|s| match DateTime::parse_from_rfc3339(&s) {
774 | Ok(date_time) => Ok(date_time.into()),
775 | Err(_) => Err(AttributeError::InvalidFormat),
776 | })
777 | }
778 | }
779 |
780 | /// A `String` type, represented by the S AttributeValue type
781 | impl Attribute for String {
782 | fn into_attr(self) -> AttributeValue {
783 | AttributeValue {
784 | s: Some(self),
785 | ..AttributeValue::default()
786 | }
787 | }
788 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
789 | value.s.ok_or(AttributeError::InvalidType)
790 | }
791 | }
792 |
793 | impl<'a> Attribute for Cow<'a, str> {
794 | fn into_attr(self) -> AttributeValue {
795 | AttributeValue {
796 | s: Some(match self {
797 | Cow::Owned(o) => o,
798 | Cow::Borrowed(b) => b.to_owned(),
799 | }),
800 | ..AttributeValue::default()
801 | }
802 | }
803 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
804 | value.s.map(Cow::Owned).ok_or(AttributeError::InvalidType)
805 | }
806 | }
807 |
808 | /// A String Set type, represented by the SS AttributeValue type
809 | #[allow(clippy::implicit_hasher)]
810 | impl Attribute for HashSet<String> {
811 | fn into_attr(mut self) -> AttributeValue {
812 | AttributeValue {
813 | ss: Some(self.drain().collect()),
814 | ..AttributeValue::default()
815 | }
816 | }
817 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
818 | value
819 | .ss
820 | .ok_or(AttributeError::InvalidType)
821 | .map(|mut value| value.drain(..).collect())
822 | }
823 | }
824 |
825 | impl Attribute for BTreeSet<String> {
826 | fn into_attr(self) -> AttributeValue {
827 | AttributeValue {
828 | ss: Some(self.into_iter().collect()),
829 | ..AttributeValue::default()
830 | }
831 | }
832 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
833 | value
834 | .ss
835 | .ok_or(AttributeError::InvalidType)
836 | .map(|mut value| value.drain(..).collect())
837 | }
838 | }
839 |
840 | /// A Binary Set type, represented by the BS AttributeValue type
841 | #[allow(clippy::implicit_hasher)]
842 | impl Attribute for HashSet<Vec<u8>> {
843 | fn into_attr(mut self) -> AttributeValue {
844 | AttributeValue {
845 | bs: Some(self.drain().map(Bytes::from).collect()),
846 | ..AttributeValue::default()
847 | }
848 | }
849 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
850 | value
851 | .bs
852 | .ok_or(AttributeError::InvalidType)
853 | .map(|mut value| value.drain(..).map(|bs| bs.as_ref().to_vec()).collect())
854 | }
855 | }
856 |
857 | // a Boolean type, represented by the BOOL AttributeValue type
858 | impl Attribute for bool {
859 | fn into_attr(self) -> AttributeValue {
860 | AttributeValue {
861 | bool: Some(self),
862 | ..AttributeValue::default()
863 | }
864 | }
865 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
866 | value.bool.ok_or(AttributeError::InvalidType)
867 | }
868 | }
869 |
870 | // a Binary type, represented by the B AttributeValue type
871 | impl Attribute for bytes::Bytes {
872 | fn into_attr(self) -> AttributeValue {
873 | AttributeValue {
874 | b: Some(self),
875 | ..AttributeValue::default()
876 | }
877 | }
878 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
879 | value.b.ok_or(AttributeError::InvalidType)
880 | }
881 | }
882 |
883 | // a Binary type, represented by the B AttributeValue type
884 | impl Attribute for Vec<u8> {
885 | fn into_attr(self) -> AttributeValue {
886 | AttributeValue {
887 | b: Some(self.into()),
888 | ..AttributeValue::default()
889 | }
890 | }
891 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
892 | value
893 | .b
894 | .ok_or(AttributeError::InvalidType)
895 | .map(|bs| bs.as_ref().to_vec())
896 | }
897 | }
898 |
899 | /// A List type for vectors, represented by the L AttributeValue type
900 | ///
901 | /// Note: Vectors support homogeneous collection values. This means
902 | /// the default supported scalars do not permit cases where you need
903 | /// to store a list of heterogeneous values. To accomplish this you'll need
904 | /// to implement a wrapper type that represents your desired variants
905 | /// and implement `Attribute` for `YourType`. A `Vec<YourType>` implementation
906 | /// will already be provided
907 | impl<A: Attribute> Attribute for Vec<A> {
908 | fn into_attr(mut self) -> AttributeValue {
909 | AttributeValue {
910 | l: Some(self.drain(..).map(|s| s.into_attr()).collect()),
911 | ..AttributeValue::default()
912 | }
913 | }
914 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
915 | value
916 | .l
917 | .ok_or(AttributeError::InvalidType)?
918 | .into_iter()
919 | .map(Attribute::from_attr)
920 | .collect()
921 | }
922 | }
923 |
924 | impl<A: Attribute> Attribute for Option<A> {
925 | fn into_attr(self) -> AttributeValue {
926 | match self {
927 | Some(value) => value.into_attr(),
928 | _ => AttributeValue {
929 | null: Some(true),
930 | ..Default::default()
931 | },
932 | }
933 | }
934 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
935 | match value.null {
936 | Some(true) => Ok(None),
937 | _ => Ok(Some(Attribute::from_attr(value)?)),
938 | }
939 | }
940 | }
941 |
942 | macro_rules! numeric_attr {
943 | ($type:ty) => {
944 | impl Attribute for $type {
945 | fn into_attr(self) -> AttributeValue {
946 | AttributeValue {
947 | n: Some(self.to_string()),
948 | ..AttributeValue::default()
949 | }
950 | }
951 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
952 | value
953 | .n
954 | .ok_or(AttributeError::InvalidType)
955 | .and_then(|num| num.parse().map_err(|_| AttributeError::InvalidFormat))
956 | }
957 | }
958 | };
959 | }
960 |
961 | macro_rules! numeric_set_attr {
962 | ($type:ty => $collection:ty) => {
963 | /// A Number set type, represented by the NS AttributeValue type
964 | impl Attribute for $collection {
965 | fn into_attr(self) -> crate::AttributeValue {
966 | AttributeValue {
967 | ns: Some(self.iter().map(|item| item.to_string()).collect()),
968 | ..AttributeValue::default()
969 | }
970 | }
971 | fn from_attr(value: AttributeValue) -> Result<Self, AttributeError> {
972 | let mut nums = value.ns.ok_or(AttributeError::InvalidType)?;
973 | let mut results: Vec<Result<$type, AttributeError>> = nums
974 | .drain(..)
975 | .map(|ns| ns.parse().map_err(|_| AttributeError::InvalidFormat))
976 | .collect();
977 | results.drain(..).collect()
978 | }
979 | }
980 | };
981 | }
982 |
983 | // implement Attribute for numeric types
984 | numeric_attr!(u16);
985 | numeric_attr!(i16);
986 | numeric_attr!(u32);
987 | numeric_attr!(i32);
988 | numeric_attr!(u64);
989 | numeric_attr!(i64);
990 | numeric_attr!(f32);
991 | numeric_attr!(f64);
992 |
993 | // implement Attribute for numeric collections
994 | numeric_set_attr!(u16 => HashSet<u16>);
995 | numeric_set_attr!(u16 => BTreeSet<u16>);
996 | numeric_set_attr!(i16 => HashSet<i16>);
997 | numeric_set_attr!(i16 => BTreeSet<i16>);
998 |
999 | numeric_set_attr!(u32 => HashSet<u32>);
1000 | numeric_set_attr!(u32 => BTreeSet<u32>);
1001 | numeric_set_attr!(i32 => HashSet<i32>);
1002 | numeric_set_attr!(i32 => BTreeSet<i32>);
1003 |
1004 | numeric_set_attr!(i64 => HashSet<i64>);
1005 | numeric_set_attr!(i64 => BTreeSet<i64>);
1006 | numeric_set_attr!(u64 => HashSet<u64>);
1007 | numeric_set_attr!(u64 => BTreeSet<u64>);
1008 |
1009 | // note floats don't implement `Ord` and thus can't
1010 | // be used in various XXXSet types
1011 | //numeric_set_attr!(f32 => HashSet<f32>);
1012 | //numeric_set_attr!(f32 => BTreeSet<f32>);
1013 | //numeric_set_attr!(f64 => HashSet<f64>);
1014 | //numeric_set_attr!(f64 => BTreeSet<f64>);
1015 |
1016 | #[macro_export]
1017 | /// Creates a `HashMap` from a list of key-value pairs
1018 | ///
1019 | /// This provides some convenience for some interfaces,
1020 | /// like [query](../rusoto_dynamodb/struct.QueryInput.html#structfield.expression_attribute_values)
1021 | /// where a map of this type is required.
1022 | ///
1023 | /// The syntax for this macro is the same as [maplit](https://crates.io/crates/maplit).
1024 | ///
1025 | /// Avoid using `&str` slices for values when creating a mapping for a `String` `AttributeValue`.
1026 | /// Instead use a `String`.
1027 | ///
1028 | /// ## Example
1029 | ///
1030 | /// ```
1031 | /// use dynomite::dynamodb::QueryInput;
1032 | /// use dynomite::attr_map;
1033 | ///
1034 | /// let query = QueryInput {
1035 | /// table_name: "some_table".into(),
1036 | /// key_condition_expression: Some(
1037 | /// "partitionKeyName = :partitionkeyval".into()
1038 | /// ),
1039 | /// expression_attribute_values: Some(
1040 | /// attr_map! {
1041 | /// ":partitionkeyval" => "rust".to_string()
1042 | /// }
1043 | /// ),
1044 | /// ..QueryInput::default()
1045 | /// };
1046 | macro_rules! attr_map {
1047 | (@single $($x:tt)*) => (());
1048 | (@count $($rest:expr),*) => (<[()]>::len(&[$($crate::attr_map!(@single $rest)),*]));
1049 | ($($key:expr => $value:expr,)+) => { $crate::attr_map!($($key => $value),+) };
1050 | ($($key:expr => $value:expr),*) => {
1051 | {
1052 | let _cap = $crate::attr_map!(@count $($key),*);
1053 | let mut _map: ::std::collections::HashMap<String, $crate::dynamodb::AttributeValue> =
1054 | ::std::collections::HashMap::with_capacity(_cap);
1055 | {
1056 | use ::dynomite::Attribute;
1057 | $(
1058 | let _ = _map.insert($key.into(), $value.into_attr());
1059 | )*
1060 | }
1061 | _map
1062 | }
1063 | };
1064 | }
1065 |
1066 | // Re-export #[derive(Item)]
1067 | // work around for 2018 edition issue with needing to
1068 | // import both dynomite::Item and dynomite_derive::Item
1069 | // https://internals.rust-lang.org/t/2018-edition-custom-derives-and-shadowy-import-ux/9097
1070 | #[cfg(feature = "derive")]
1071 | #[allow(unused_imports)]
1072 | #[macro_use]
1073 | extern crate dynomite_derive;
1074 | #[cfg(feature = "derive")]
1075 | #[doc(hidden)]
1076 | pub use dynomite_derive::*;
1077 |
1078 | #[cfg(test)]
1079 | mod test {
1080 | use super::*;
1081 | use maplit::{btreemap, btreeset, hashmap};
1082 |
1083 | #[test]
1084 | fn uuid_attr() {
1085 | let value = Uuid::new_v4();
1086 | assert_eq!(Ok(value), Uuid::from_attr(value.into_attr()));
1087 | }
1088 |
1089 | #[test]
1090 | fn uuid_invalid_attr() {
1091 | assert_eq!(
1092 | Err(AttributeError::InvalidType),
1093 | Uuid::from_attr(AttributeValue {
1094 | bool: Some(true),
1095 | ..AttributeValue::default()
1096 | })
1097 | );
1098 | }
1099 |
1100 | #[test]
1101 | #[cfg(feature = "chrono")]
1102 | fn chrono_datetime_utc_attr() {
1103 | let value = Utc::now();
1104 | assert_eq!(Ok(value), DateTime::<Utc>::from_attr(value.into_attr()));
1105 | }
1106 |
1107 | #[test]
1108 | #[cfg(feature = "chrono")]
1109 | fn chrono_datetime_invalid_utc_attr() {
1110 | assert_eq!(
1111 | Err(AttributeError::InvalidType),
1112 | DateTime::<Utc>::from_attr(AttributeValue {
1113 | bool: Some(true),
1114 | ..AttributeValue::default()
1115 | })
1116 | );
1117 | }
1118 |
1119 | #[test]
1120 | #[cfg(feature = "chrono")]
1121 | fn chrono_datetime_local_attr() {
1122 | let value = Local::now();
1123 | assert_eq!(Ok(value), DateTime::<Local>::from_attr(value.into_attr()));
1124 | }
1125 |
1126 | #[test]
1127 | #[cfg(feature = "chrono")]
1128 | fn chrono_datetime_invalid_local_attr() {
1129 | assert_eq!(
1130 | Err(AttributeError::InvalidType),
1131 | DateTime::<Local>::from_attr(AttributeValue {
1132 | bool: Some(true),
1133 | ..AttributeValue::default()
1134 | })
1135 | );
1136 | }
1137 |
1138 | #[test]
1139 | #[cfg(feature = "chrono")]
1140 | fn chrono_datetime_fixedoffset_attr() {
1141 | use chrono::offset::TimeZone;
1142 | let value = FixedOffset::east(5 * 3600)
1143 | .ymd(2015, 2, 18)
1144 | .and_hms(23, 16, 9);
1145 | assert_eq!(
1146 | Ok(value),
1147 | DateTime::<FixedOffset>::from_attr(value.into_attr())
1148 | );
1149 | }
1150 |
1151 | #[test]
1152 | #[cfg(feature = "chrono")]
1153 | fn chrono_datetime_invalid_fixedoffset_attr() {
1154 | assert_eq!(
1155 | Err(AttributeError::InvalidType),
1156 | DateTime::<FixedOffset>::from_attr(AttributeValue {
1157 | bool: Some(true),
1158 | ..AttributeValue::default()
1159 | })
1160 | );
1161 | }
1162 |
1163 | #[test]
1164 | #[cfg(feature = "chrono")]
1165 | fn system_time_attr() {
1166 | use std::time::SystemTime;
1167 | let value = SystemTime::now();
1168 | assert_eq!(Ok(value), SystemTime::from_attr(value.into_attr()));
1169 | }
1170 |
1171 | #[test]
1172 | #[cfg(feature = "chrono")]
1173 | fn system_time_invalid_attr() {
1174 | use std::time::SystemTime;
1175 | assert_eq!(
1176 | Err(AttributeError::InvalidType),
1177 | SystemTime::from_attr(AttributeValue {
1178 | bool: Some(true),
1179 | ..AttributeValue::default()
1180 | })
1181 | );
1182 | }
1183 |
1184 | #[test]
1185 | fn option_some_attr() {
1186 | let value = Some(1);
1187 | assert_eq!(Ok(value), Attribute::from_attr(value.into_attr()));
1188 | }
1189 |
1190 | #[test]
1191 | fn option_none_attr() {
1192 | let value: Option<u32> = None;
1193 | assert_eq!(Ok(value), Attribute::from_attr(value.into_attr()));
1194 | }
1195 |
1196 | #[test]
1197 | fn option_invalid_attr() {
1198 | assert_eq!(
1199 | Err(AttributeError::InvalidType),
1200 | Option::<u32>::from_attr(AttributeValue {
1201 | bool: Some(true),
1202 | ..AttributeValue::default()
1203 | })
1204 | );
1205 | }
1206 |
1207 | #[test]
1208 | fn bool_attr() {
1209 | let value = true;
1210 | assert_eq!(Ok(value), bool::from_attr(value.into_attr()));
1211 | }
1212 |
1213 | #[test]
1214 | fn string_attr() {
1215 | let value = "test".to_string();
1216 | assert_eq!(Ok(value.clone()), String::from_attr(value.into_attr()));
1217 | }
1218 |
1219 | #[test]
1220 | fn bytes_attr_from_attr() {
1221 | let value = Bytes::from("test");
1222 | assert_eq!(Ok(value.clone()), Bytes::from_attr(value.into_attr()));
1223 | }
1224 |
1225 | #[test]
1226 | fn byte_vec_attr_from_attr() {
1227 | let value = b"test".to_vec();
1228 | assert_eq!(Ok(value.clone()), Vec::<u8>::from_attr(value.into_attr()));
1229 | }
1230 |
1231 | #[test]
1232 | fn numeric_into_attr() {
1233 | assert_eq!(
1234 | serde_json::to_string(&1.into_attr()).unwrap(),
1235 | r#"{"N":"1"}"#
1236 | );
1237 | }
1238 |
1239 | #[test]
1240 | fn numeric_from_attr() {
1241 | assert_eq!(
1242 | Attribute::from_attr(serde_json::from_str::<AttributeValue>(r#"{"N":"1"}"#).unwrap()),
1243 | Ok(1)
1244 | );
1245 | }
1246 |
1247 | #[test]
1248 | fn string_into_attr() {
1249 | assert_eq!(
1250 | serde_json::to_string(&"foo".to_string().into_attr()).unwrap(),
1251 | r#"{"S":"foo"}"#
1252 | );
1253 | }
1254 |
1255 | #[test]
1256 | fn string_from_attr() {
1257 | assert_eq!(
1258 | Attribute::from_attr(serde_json::from_str::<AttributeValue>(r#"{"S":"foo"}"#).unwrap()),
1259 | Ok("foo".to_string())
1260 | );
1261 | }
1262 |
1263 | #[test]
1264 | fn cow_str_into_attr() {
1265 | assert_eq!(
1266 | serde_json::to_string(&Cow::Borrowed("foo").into_attr()).unwrap(),
1267 | r#"{"S":"foo"}"#
1268 | );
1269 | }
1270 |
1271 | #[test]
1272 | fn cow_str_from_attr() {
1273 | assert_eq!(
1274 | Attribute::from_attr(serde_json::from_str::<AttributeValue>(r#"{"S":"foo"}"#).unwrap()),
1275 | Ok(Cow::Borrowed("foo"))
1276 | );
1277 | }
1278 |
1279 | #[test]
1280 | fn byte_vec_into_attr() {
1281 | assert_eq!(
1282 | serde_json::to_string(&b"foo".to_vec().into_attr()).unwrap(),
1283 | r#"{"B":"Zm9v"}"# // ruosoto converts to base64 for us
1284 | );
1285 | }
1286 |
1287 | #[test]
1288 | fn byte_vec_from_attr() {
1289 | // rusoto converts to base64 for us
1290 | assert_eq!(
1291 | Attribute::from_attr(
1292 | serde_json::from_str::<AttributeValue>(r#"{"B":"Zm9v"}"#).unwrap()
1293 | ),
1294 | Ok(b"foo".to_vec())
1295 | );
1296 | }
1297 |
1298 | #[test]
1299 | fn bytes_into_attr() {
1300 | assert_eq!(
1301 | serde_json::to_string(&Bytes::from("foo").into_attr()).unwrap(),
1302 | r#"{"B":"Zm9v"}"# // ruosoto converts to base64 for us
1303 | );
1304 | }
1305 |
1306 | #[test]
1307 | fn bytes_from_attr() {
1308 | assert_eq!(
1309 | Attribute::from_attr(
1310 | serde_json::from_str::<AttributeValue>(r#"{"B":"Zm9v"}"#).unwrap()
1311 | ),
1312 | Ok(Bytes::from("foo"))
1313 | );
1314 | }
1315 |
1316 | #[test]
1317 | fn numeric_set_into_attr() {
1318 | assert_eq!(
1319 | serde_json::to_string(&btreeset! { 1,2,3 }.into_attr()).unwrap(),
1320 | r#"{"NS":["1","2","3"]}"#
1321 | );
1322 | }
1323 |
1324 | #[test]
1325 | fn numeric_set_from_attr() {
1326 | assert_eq!(
1327 | Attribute::from_attr(
1328 | serde_json::from_str::<AttributeValue>(r#"{"NS":["1","2","3"]}"#).unwrap()
1329 | ),
1330 | Ok(btreeset! { 1,2,3 })
1331 | );
1332 | }
1333 |
1334 | #[test]
1335 | fn numeric_vec_into_attr() {
1336 | assert_eq!(
1337 | serde_json::to_string(&vec![1, 2, 3, 3].into_attr()).unwrap(),
1338 | r#"{"L":[{"N":"1"},{"N":"2"},{"N":"3"},{"N":"3"}]}"#
1339 | );
1340 | }
1341 |
1342 | #[test]
1343 | fn numeric_vec_from_attr() {
1344 | assert_eq!(
1345 | Attribute::from_attr(
1346 | serde_json::from_str::<AttributeValue>(
1347 | r#"{"L":[{"N":"1"},{"N":"2"},{"N":"3"},{"N":"3"}]}"#
1348 | )
1349 | .unwrap()
1350 | ),
1351 | Ok(vec![1, 2, 3, 3])
1352 | );
1353 | }
1354 |
1355 | #[test]
1356 | fn string_set_into_attr() {
1357 | assert_eq!(
1358 | serde_json::to_string(
1359 | &btreeset! { "a".to_string(), "b".to_string(), "c".to_string() }.into_attr()
1360 | )
1361 | .unwrap(),
1362 | r#"{"SS":["a","b","c"]}"#
1363 | );
1364 | }
1365 |
1366 | #[test]
1367 | fn string_set_from_attr() {
1368 | assert_eq!(
1369 | Attribute::from_attr(
1370 | serde_json::from_str::<AttributeValue>(r#"{"SS":["a","b","c"]}"#).unwrap()
1371 | ),
1372 | Ok(btreeset! { "a".to_string(), "b".to_string(), "c".to_string() })
1373 | );
1374 | }
1375 |
1376 | #[test]
1377 | fn string_vec_into_attr() {
1378 | assert_eq!(
1379 | serde_json::to_string(
1380 | &vec! { "a".to_string(), "b".to_string(), "c".to_string() }.into_attr()
1381 | )
1382 | .unwrap(),
1383 | r#"{"L":[{"S":"a"},{"S":"b"},{"S":"c"}]}"#
1384 | );
1385 | }
1386 |
1387 | #[test]
1388 | fn string_vec_from_attr() {
1389 | assert_eq!(
1390 | Attribute::from_attr(
1391 | serde_json::from_str::<AttributeValue>(r#"{"L":[{"S":"a"},{"S":"b"},{"S":"c"}]}"#)
1392 | .unwrap()
1393 | ),
1394 | Ok(vec! { "a".to_string(), "b".to_string(), "c".to_string() })
1395 | );
1396 | }
1397 |
1398 | #[test]
1399 | fn hashmap_into_attr() {
1400 | assert_eq!(
1401 | serde_json::to_string(&hashmap! { "foo".to_string() => 1 }.into_attr()).unwrap(),
1402 | r#"{"M":{"foo":{"N":"1"}}}"#
1403 | );
1404 | }
1405 |
1406 | #[test]
1407 | fn hashmap_from_attr() {
1408 | assert_eq!(
1409 | Attribute::from_attr(
1410 | serde_json::from_str::<AttributeValue>(r#"{"M":{"foo":{"N":"1"}}}"#).unwrap()
1411 | ),
1412 | Ok(hashmap! { "foo".to_string() => 1 })
1413 | );
1414 | }
1415 |
1416 | #[test]
1417 | fn btreemap_into_attr() {
1418 | assert_eq!(
1419 | serde_json::to_string(&btreemap! { "foo".to_string() => 1 }.into_attr()).unwrap(),
1420 | r#"{"M":{"foo":{"N":"1"}}}"#
1421 | );
1422 | }
1423 |
1424 | #[test]
1425 | fn btreemap_from_attr() {
1426 | assert_eq!(
1427 | Attribute::from_attr(
1428 | serde_json::from_str::<AttributeValue>(r#"{"M":{"foo":{"N":"1"}}}"#).unwrap()
1429 | ),
1430 | Ok(btreemap! { "foo".to_string() => 1 })
1431 | );
1432 | }
1433 | }
1434 |
--------------------------------------------------------------------------------
/dynomite/src/retry.rs:
--------------------------------------------------------------------------------
1 | //! Retry functionality
2 | //!
3 | //! Specifically this implementation focuses on honoring [these documented DynamoDB retryable errors](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Programming.Errors.html#Programming.Errors.MessagesAndCodes)
4 | //! on top of AWS's general recommendations for [retrying API requests](https://docs.aws.amazon.com/general/latest/gr/api-retries.html).
5 | //!
6 | //! # examples
7 | //! ```rust,no_run
8 | //! use dynomite::{Retries, retry::Policy};
9 | //! use dynomite::dynamodb::{DynamoDb, DynamoDbClient};
10 | //!
11 | //! let client =
12 | //! DynamoDbClient::new(Default::default())
13 | //! .with_retries(Policy::default());
14 | //!
15 | //! // any client operation will now be retried when
16 | //! // appropriate
17 | //! let tables = client.list_tables(Default::default());
18 | //! ```
19 |
20 | use crate::dynamodb::*;
21 | use again::{Condition, RetryPolicy};
22 | use log::debug;
23 | use rusoto_core::RusotoError;
24 | use std::{sync::Arc, time::Duration};
25 |
26 | /// Pre-configured retry policies for fallible operations
27 | ///
28 | /// The `Default` impl retries 5 times with an exponential backoff starting at 100 milliseconds
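///
/// A short sketch of constructing policies (the durations here are illustrative):
///
/// ```
/// use dynomite::retry::Policy;
/// use std::time::Duration;
///
/// // equivalent to the `Default` impl
/// let exponential = Policy::Exponential(5, Duration::from_millis(100));
/// // a limited number of retries with a fixed pause in between
/// let fixed = Policy::Pause(3, Duration::from_millis(10));
/// assert_eq!(exponential, Policy::default());
/// assert_ne!(fixed, Policy::default());
/// ```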
29 | #[derive(Clone, PartialEq, Debug)]
30 | pub enum Policy {
31 | /// Limited number of times to retry
32 | Limit(usize),
33 | /// Limited number of times to retry with fixed pause between retries
34 | Pause(usize, Duration),
35 | /// Limited number of times to retry with an exponential pause between retries
36 | Exponential(usize, Duration),
37 | }
38 |
39 | impl Default for Policy {
40 | fn default() -> Self {
41 | Policy::Exponential(5, Duration::from_millis(100))
42 | }
43 | }
44 |
45 | impl From<Policy> for RetryPolicy {
46 | fn from(policy: Policy) -> RetryPolicy {
47 | match policy {
48 | Policy::Limit(times) => RetryPolicy::default()
49 | .with_max_retries(times)
50 | .with_jitter(true),
51 | Policy::Pause(times, duration) => RetryPolicy::fixed(duration)
52 | .with_max_retries(times)
53 | .with_jitter(true),
54 | Policy::Exponential(times, duration) => RetryPolicy::exponential(duration)
55 | .with_max_retries(times)
56 | .with_jitter(true),
57 | }
58 | }
59 | }
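
The conversion above is the only place the variants differ in behavior; here is a minimal sketch of constructing each one and converting it (assuming the `again` crate is also available as a direct dependency so that `RetryPolicy` can be named):

```rust
use again::RetryPolicy;
use dynomite::retry::Policy;
use std::time::Duration;

fn main() {
    // Each variant carries a maximum retry count; the latter two also carry a delay.
    let limit = Policy::Limit(3);
    let fixed = Policy::Pause(3, Duration::from_millis(500));
    let expo = Policy::Exponential(5, Duration::from_millis(100)); // same as Policy::default()

    // All three convert into an `again::RetryPolicy` with jitter enabled.
    let _policies: Vec<RetryPolicy> = vec![limit.into(), fixed.into(), expo.into()];
}
```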
60 |
61 | /// Predicate trait that determines if an impl
62 | /// type is retryable
63 | trait Retry {
64 | /// Return true if type is retryable
65 | fn retryable(&self) -> bool;
66 | }
67 |
68 | struct Counter(u16);
69 |
70 | impl<R> Condition<RusotoError<R>> for Counter
71 | where
72 | R: Retry,
73 | {
74 | fn is_retryable(
75 | &mut self,
76 | error: &RusotoError<R>,
77 | ) -> bool {
78 | debug!("retrying operation {}", self.0);
79 | if let Some(value) = self.0.checked_add(1) {
80 | self.0 = value;
81 | }
82 | match error {
83 | RusotoError::Service(e) => e.retryable(),
84 | _ => false,
85 | }
86 | }
87 | }
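
`Counter` is private, but it is just an `again::Condition` implementation; the following is an illustrative sketch (not part of this crate) of the same trait shape with a trivial classifier:

```rust
use again::Condition;

/// A toy classifier mirroring the shape of `Counter` above:
/// it retries every error, regardless of type.
struct AlwaysRetry;

impl<E> Condition<E> for AlwaysRetry {
    fn is_retryable(&mut self, _error: &E) -> bool {
        true
    }
}

fn main() {
    // Something like this could be passed wherever `Counter(0)` is passed below,
    // e.g. `policy.retry_if(make_request, AlwaysRetry).await`.
    let _ = AlwaysRetry;
}
```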
88 |
89 | // Wrapper so the client and retry policy share a single Arc allocation
90 | struct Inner<D> {
91 | client: D,
92 | policy: RetryPolicy,
93 | }
94 |
95 | /// A type which implements `DynamoDb` and retries all operations
96 | /// that are retryable
97 | #[derive(Clone)]
98 | pub struct RetryingDynamoDb<D> {
99 | inner: Arc<Inner<D>>,
100 | }
101 |
102 | /// An interface for adapting a `DynamoDb` impl
103 | /// to a `RetryingDynamoDb` impl
104 | pub trait Retries<D>
105 | where
106 | D: DynamoDb + 'static,
107 | {
108 | /// Consumes a `DynamoDb` impl and produces
109 | /// a `DynamoDb` which retries its operations when appropriate
110 | fn with_retries(
111 | self,
112 | policy: Policy,
113 | ) -> RetryingDynamoDb<D>;
114 | }
115 |
116 | impl<D> Retries<D> for D
117 | where
118 | D: DynamoDb + 'static,
119 | {
120 | fn with_retries(
121 | self,
122 | policy: Policy,
123 | ) -> RetryingDynamoDb<D> {
124 | RetryingDynamoDb::new(self, policy)
125 | }
126 | }
127 |
128 | impl<D> RetryingDynamoDb<D>
129 | where
130 | D: DynamoDb + 'static,
131 | {
132 | /// Return a new instance with a configured retry policy
133 | pub fn new(
134 | client: D,
135 | policy: Policy,
136 | ) -> Self {
137 | Self {
138 | inner: Arc::new(Inner {
139 | client,
140 | policy: policy.into(),
141 | }),
142 | }
143 | }
144 | }
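
Both construction paths end up here; a short sketch of the two equivalent ways to obtain a retrying client (reusing the default client setup shown in the module docs above):

```rust
use dynomite::{
    dynamodb::DynamoDbClient,
    retry::{Policy, RetryingDynamoDb},
    Retries,
};
use std::time::Duration;

fn main() {
    // Adapter trait: wrap an existing client in place.
    let _via_trait = DynamoDbClient::new(Default::default())
        .with_retries(Policy::Pause(3, Duration::from_millis(250)));

    // Direct constructor: equivalent to the call `with_retries` makes internally.
    let _via_ctor = RetryingDynamoDb::new(
        DynamoDbClient::new(Default::default()),
        Policy::default(), // Exponential(5, 100ms)
    );
}
```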
145 |
146 | #[async_trait::async_trait]
147 | impl<D> DynamoDb for RetryingDynamoDb<D>
148 | where
149 | D: DynamoDb + Sync + Send + Clone + 'static,
150 | {
151 | async fn batch_get_item(
152 | &self,
153 | input: BatchGetItemInput,
154 | ) -> Result<BatchGetItemOutput, RusotoError<BatchGetItemError>> {
155 | self.inner
156 | .policy
157 | .retry_if(
158 | move || {
159 | let client = self.inner.clone().client.clone();
160 | let input = input.clone();
161 | async move { client.batch_get_item(input).await }
162 | },
163 | Counter(0),
164 | )
165 | .await
166 | }
167 |
168 | async fn batch_write_item(
169 | &self,
170 | input: BatchWriteItemInput,
171 | ) -> Result<BatchWriteItemOutput, RusotoError<BatchWriteItemError>> {
172 | self.inner
173 | .policy
174 | .retry_if(
175 | move || {
176 | let client = self.inner.clone().client.clone();
177 | let input = input.clone();
178 | async move { client.batch_write_item(input).await }
179 | },
180 | Counter(0),
181 | )
182 | .await
183 | }
184 |
185 | async fn create_backup(
186 | &self,
187 | input: CreateBackupInput,
188 | ) -> Result<CreateBackupOutput, RusotoError<CreateBackupError>> {
189 | self.inner
190 | .policy
191 | .retry_if(
192 | move || {
193 | let client = self.inner.clone().client.clone();
194 | let input = input.clone();
195 | async move { client.create_backup(input).await }
196 | },
197 | Counter(0),
198 | )
199 | .await
200 | }
201 |
202 | async fn create_global_table(
203 | &self,
204 | input: CreateGlobalTableInput,
205 | ) -> Result<CreateGlobalTableOutput, RusotoError<CreateGlobalTableError>> {
206 | self.inner
207 | .policy
208 | .retry_if(
209 | move || {
210 | let client = self.inner.clone().client.clone();
211 | let input = input.clone();
212 | async move { client.create_global_table(input).await }
213 | },
214 | Counter(0),
215 | )
216 | .await
217 | }
218 |
219 | async fn create_table(
220 | &self,
221 | input: CreateTableInput,
222 | ) -> Result<CreateTableOutput, RusotoError<CreateTableError>> {
223 | self.inner
224 | .policy
225 | .retry_if(
226 | move || {
227 | let client = self.inner.clone().client.clone();
228 | let input = input.clone();
229 | async move { client.create_table(input).await }
230 | },
231 | Counter(0),
232 | )
233 | .await
234 | }
235 |
236 | async fn delete_backup(
237 | &self,
238 | input: DeleteBackupInput,
239 | ) -> Result<DeleteBackupOutput, RusotoError<DeleteBackupError>> {
240 | self.inner
241 | .policy
242 | .retry_if(
243 | move || {
244 | let client = self.inner.clone().client.clone();
245 | let input = input.clone();
246 | async move { client.delete_backup(input).await }
247 | },
248 | Counter(0),
249 | )
250 | .await
251 | }
252 |
253 | async fn delete_item(
254 | &self,
255 | input: DeleteItemInput,
256 | ) -> Result<DeleteItemOutput, RusotoError<DeleteItemError>> {
257 | self.inner
258 | .policy
259 | .retry_if(
260 | move || {
261 | let client = self.inner.clone().client.clone();
262 | let input = input.clone();
263 | async move { client.delete_item(input).await }
264 | },
265 | Counter(0),
266 | )
267 | .await
268 | }
269 |
270 | async fn delete_table(
271 | &self,
272 | input: DeleteTableInput,
273 | ) -> Result<DeleteTableOutput, RusotoError<DeleteTableError>> {
274 | self.inner
275 | .policy
276 | .retry_if(
277 | move || {
278 | let client = self.inner.clone().client.clone();
279 | let input = input.clone();
280 | async move { client.delete_table(input).await }
281 | },
282 | Counter(0),
283 | )
284 | .await
285 | }
286 |
287 | async fn describe_backup(
288 | &self,
289 | input: DescribeBackupInput,
290 | ) -> Result<DescribeBackupOutput, RusotoError<DescribeBackupError>> {
291 | self.inner
292 | .policy
293 | .retry_if(
294 | move || {
295 | let client = self.inner.clone().client.clone();
296 | let input = input.clone();
297 | async move { client.describe_backup(input).await }
298 | },
299 | Counter(0),
300 | )
301 | .await
302 | }
303 |
304 | async fn describe_export(
305 | &self,
306 | input: DescribeExportInput,
307 | ) -> Result<DescribeExportOutput, RusotoError<DescribeExportError>> {
308 | self.inner.client.describe_export(input).await
309 | }
310 |
311 | async fn describe_continuous_backups(
312 | &self,
313 | input: DescribeContinuousBackupsInput,
314 | ) -> Result<DescribeContinuousBackupsOutput, RusotoError<DescribeContinuousBackupsError>> {
315 | self.inner
316 | .policy
317 | .retry_if(
318 | move || {
319 | let client = self.inner.clone().client.clone();
320 | let input = input.clone();
321 | async move { client.describe_continuous_backups(input).await }
322 | },
323 | Counter(0),
324 | )
325 | .await
326 | }
327 |
328 | async fn describe_contributor_insights(
329 | &self,
330 | input: DescribeContributorInsightsInput,
331 | ) -> Result<DescribeContributorInsightsOutput, RusotoError<DescribeContributorInsightsError>>
332 | {
333 | self.inner.client.describe_contributor_insights(input).await
334 | }
335 |
336 | async fn describe_global_table(
337 | &self,
338 | input: DescribeGlobalTableInput,
339 | ) -> Result<DescribeGlobalTableOutput, RusotoError<DescribeGlobalTableError>> {
340 | self.inner
341 | .policy
342 | .retry_if(
343 | move || {
344 | let client = self.inner.clone().client.clone();
345 | let input = input.clone();
346 | async move { client.describe_global_table(input).await }
347 | },
348 | Counter(0),
349 | )
350 | .await
351 | }
352 |
353 | async fn describe_global_table_settings(
354 | &self,
355 | input: DescribeGlobalTableSettingsInput,
356 | ) -> Result<DescribeGlobalTableSettingsOutput, RusotoError<DescribeGlobalTableSettingsError>>
357 | {
358 | self.inner
359 | .policy
360 | .retry_if(
361 | move || {
362 | let client = self.inner.clone().client.clone();
363 | let input = input.clone();
364 | async move { client.describe_global_table_settings(input).await }
365 | },
366 | Counter(0),
367 | )
368 | .await
369 | }
370 |
371 | async fn describe_limits(
372 | &self
373 | ) -> Result<DescribeLimitsOutput, RusotoError<DescribeLimitsError>> {
374 | self.inner
375 | .policy
376 | .retry_if(
377 | move || {
378 | let client = self.inner.clone().client.clone();
379 | async move { client.describe_limits().await }
380 | },
381 | Counter(0),
382 | )
383 | .await
384 | }
385 |
386 | async fn describe_table(
387 | &self,
388 | input: DescribeTableInput,
389 | ) -> Result<DescribeTableOutput, RusotoError<DescribeTableError>> {
390 | self.inner
391 | .policy
392 | .retry_if(
393 | move || {
394 | let client = self.inner.clone().client.clone();
395 | let input = input.clone();
396 | async move { client.describe_table(input).await }
397 | },
398 | Counter(0),
399 | )
400 | .await
401 | }
402 |
403 | async fn describe_table_replica_auto_scaling(
404 | &self,
405 | input: DescribeTableReplicaAutoScalingInput,
406 | ) -> Result<
407 | DescribeTableReplicaAutoScalingOutput,
408 | RusotoError<DescribeTableReplicaAutoScalingError>,
409 | > {
410 | self.inner
411 | .client
412 | .describe_table_replica_auto_scaling(input)
413 | .await
414 | }
415 |
416 | async fn describe_time_to_live(
417 | &self,
418 | input: DescribeTimeToLiveInput,
419 | ) -> Result<DescribeTimeToLiveOutput, RusotoError<DescribeTimeToLiveError>> {
420 | self.inner
421 | .policy
422 | .retry_if(
423 | move || {
424 | let client = self.inner.clone().client.clone();
425 | let input = input.clone();
426 | async move { client.describe_time_to_live(input).await }
427 | },
428 | Counter(0),
429 | )
430 | .await
431 | }
432 |
433 | async fn get_item(
434 | &self,
435 | input: GetItemInput,
436 | ) -> Result<GetItemOutput, RusotoError<GetItemError>> {
437 | self.inner
438 | .policy
439 | .retry_if(
440 | move || {
441 | let client = self.inner.clone().client.clone();
442 | let input = input.clone();
443 | async move { client.get_item(input).await }
444 | },
445 | Counter(0),
446 | )
447 | .await
448 | }
449 |
450 | async fn list_backups(
451 | &self,
452 | input: ListBackupsInput,
453 | ) -> Result<ListBackupsOutput, RusotoError<ListBackupsError>> {
454 | self.inner
455 | .policy
456 | .retry_if(
457 | move || {
458 | let client = self.inner.clone().client.clone();
459 | let input = input.clone();
460 | async move { client.list_backups(input).await }
461 | },
462 | Counter(0),
463 | )
464 | .await
465 | }
466 |
467 | async fn list_exports(
468 | &self,
469 | input: ListExportsInput,
470 | ) -> Result<ListExportsOutput, RusotoError<ListExportsError>> {
471 | self.inner.client.list_exports(input).await
472 | }
473 |
474 | async fn list_contributor_insights(
475 | &self,
476 | input: ListContributorInsightsInput,
477 | ) -> Result<ListContributorInsightsOutput, RusotoError<ListContributorInsightsError>> {
478 | self.inner.client.list_contributor_insights(input).await
479 | }
480 |
481 | async fn list_global_tables(
482 | &self,
483 | input: ListGlobalTablesInput,
484 | ) -> Result<ListGlobalTablesOutput, RusotoError<ListGlobalTablesError>> {
485 | self.inner
486 | .policy
487 | .retry_if(
488 | move || {
489 | let client = self.inner.clone().client.clone();
490 | let input = input.clone();
491 | async move { client.list_global_tables(input).await }
492 | },
493 | Counter(0),
494 | )
495 | .await
496 | }
497 |
498 | async fn list_tables(
499 | &self,
500 | input: ListTablesInput,
501 | ) -> Result<ListTablesOutput, RusotoError<ListTablesError>> {
502 | self.inner
503 | .policy
504 | .retry_if(
505 | move || {
506 | let client = self.inner.clone().client.clone();
507 | let input = input.clone();
508 | async move { client.list_tables(input).await }
509 | },
510 | Counter(0),
511 | )
512 | .await
513 | }
514 |
515 | async fn list_tags_of_resource(
516 | &self,
517 | input: ListTagsOfResourceInput,
518 | ) -> Result<ListTagsOfResourceOutput, RusotoError<ListTagsOfResourceError>> {
519 | self.inner
520 | .policy
521 | .retry_if(
522 | move || {
523 | let client = self.inner.clone().client.clone();
524 | let input = input.clone();
525 | async move { client.list_tags_of_resource(input).await }
526 | },
527 | Counter(0),
528 | )
529 | .await
530 | }
531 |
532 | async fn put_item(
533 | &self,
534 | input: PutItemInput,
535 | ) -> Result<PutItemOutput, RusotoError<PutItemError>> {
536 | self.inner
537 | .policy
538 | .retry_if(
539 | move || {
540 | let client = self.inner.clone().client.clone();
541 | let input = input.clone();
542 | async move { client.put_item(input).await }
543 | },
544 | Counter(0),
545 | )
546 | .await
547 | }
548 |
549 | async fn query(
550 | &self,
551 | input: QueryInput,
552 | ) -> Result<QueryOutput, RusotoError<QueryError>> {
553 | self.inner
554 | .policy
555 | .retry_if(
556 | move || {
557 | let client = self.inner.clone().client.clone();
558 | let input = input.clone();
559 | async move { client.query(input).await }
560 | },
561 | Counter(0),
562 | )
563 | .await
564 | }
565 |
566 | async fn restore_table_from_backup(
567 | &self,
568 | input: RestoreTableFromBackupInput,
569 | ) -> Result<RestoreTableFromBackupOutput, RusotoError<RestoreTableFromBackupError>> {
570 | self.inner
571 | .policy
572 | .retry_if(
573 | move || {
574 | let client = self.inner.clone().client.clone();
575 | let input = input.clone();
576 | async move { client.restore_table_from_backup(input).await }
577 | },
578 | Counter(0),
579 | )
580 | .await
581 | }
582 |
583 | async fn restore_table_to_point_in_time(
584 | &self,
585 | input: RestoreTableToPointInTimeInput,
586 | ) -> Result<RestoreTableToPointInTimeOutput, RusotoError<RestoreTableToPointInTimeError>> {
587 | self.inner
588 | .policy
589 | .retry_if(
590 | move || {
591 | let client = self.inner.clone().client.clone();
592 | let input = input.clone();
593 | async move { client.restore_table_to_point_in_time(input).await }
594 | },
595 | Counter(0),
596 | )
597 | .await
598 | }
599 |
600 | async fn scan(
601 | &self,
602 | input: ScanInput,
603 | ) -> Result<ScanOutput, RusotoError<ScanError>> {
604 | self.inner
605 | .policy
606 | .retry_if(
607 | move || {
608 | let client = self.inner.clone().client.clone();
609 | let input = input.clone();
610 | async move { client.scan(input).await }
611 | },
612 | Counter(0),
613 | )
614 | .await
615 | }
616 |
617 | async fn tag_resource(
618 | &self,
619 | input: TagResourceInput,
620 | ) -> Result<(), RusotoError<TagResourceError>> {
621 | self.inner
622 | .policy
623 | .retry_if(
624 | move || {
625 | let client = self.inner.clone().client.clone();
626 | let input = input.clone();
627 | async move { client.tag_resource(input).await }
628 | },
629 | Counter(0),
630 | )
631 | .await
632 | }
633 |
634 | async fn untag_resource(
635 | &self,
636 | input: UntagResourceInput,
637 | ) -> Result<(), RusotoError<UntagResourceError>> {
638 | self.inner
639 | .policy
640 | .retry_if(
641 | move || {
642 | let client = self.inner.clone().client.clone();
643 | let input = input.clone();
644 | async move { client.untag_resource(input).await }
645 | },
646 | Counter(0),
647 | )
648 | .await
649 | }
650 |
651 | async fn update_continuous_backups(
652 | &self,
653 | input: UpdateContinuousBackupsInput,
654 | ) -> Result<UpdateContinuousBackupsOutput, RusotoError<UpdateContinuousBackupsError>> {
655 | self.inner
656 | .policy
657 | .retry_if(
658 | move || {
659 | let client = self.inner.clone().client.clone();
660 | let input = input.clone();
661 | async move { client.update_continuous_backups(input).await }
662 | },
663 | Counter(0),
664 | )
665 | .await
666 | }
667 |
668 | async fn update_contributor_insights(
669 | &self,
670 | input: UpdateContributorInsightsInput,
671 | ) -> Result<UpdateContributorInsightsOutput, RusotoError<UpdateContributorInsightsError>> {
672 | // todo: retry
673 | self.inner
674 | .clone()
675 | .client
676 | .update_contributor_insights(input)
677 | .await
678 | }
679 |
680 | async fn update_global_table(
681 | &self,
682 | input: UpdateGlobalTableInput,
683 | ) -> Result<UpdateGlobalTableOutput, RusotoError<UpdateGlobalTableError>> {
684 | self.inner
685 | .policy
686 | .retry_if(
687 | move || {
688 | let client = self.inner.clone().client.clone();
689 | let input = input.clone();
690 | async move { client.update_global_table(input).await }
691 | },
692 | Counter(0),
693 | )
694 | .await
695 | }
696 |
697 | async fn update_global_table_settings(
698 | &self,
699 | input: UpdateGlobalTableSettingsInput,
700 | ) -> Result<UpdateGlobalTableSettingsOutput, RusotoError<UpdateGlobalTableSettingsError>> {
701 | self.inner
702 | .policy
703 | .retry_if(
704 | move || {
705 | let client = self.inner.clone().client.clone();
706 | let input = input.clone();
707 | async move { client.update_global_table_settings(input).await }
708 | },
709 | Counter(0),
710 | )
711 | .await
712 | }
713 |
714 | async fn update_item(
715 | &self,
716 | input: UpdateItemInput,
717 | ) -> Result<UpdateItemOutput, RusotoError<UpdateItemError>> {
718 | self.inner
719 | .policy
720 | .retry_if(
721 | move || {
722 | let client = self.inner.clone().client.clone();
723 | let input = input.clone();
724 | async move { client.update_item(input).await }
725 | },
726 | Counter(0),
727 | )
728 | .await
729 | }
730 |
731 | async fn update_table(
732 | &self,
733 | input: UpdateTableInput,
734 | ) -> Result<UpdateTableOutput, RusotoError<UpdateTableError>> {
735 | self.inner
736 | .policy
737 | .retry_if(
738 | move || {
739 | let client = self.inner.clone().client.clone();
740 | let input = input.clone();
741 | async move { client.update_table(input).await }
742 | },
743 | Counter(0),
744 | )
745 | .await
746 | }
747 |
748 | async fn update_table_replica_auto_scaling(
749 | &self,
750 | input: UpdateTableReplicaAutoScalingInput,
751 | ) -> Result<UpdateTableReplicaAutoScalingOutput, RusotoError<UpdateTableReplicaAutoScalingError>>
752 | {
753 | self.inner
754 | .client
755 | .update_table_replica_auto_scaling(input)
756 | .await
757 | }
758 |
759 | async fn update_time_to_live(
760 | &self,
761 | input: UpdateTimeToLiveInput,
762 | ) -> Result<UpdateTimeToLiveOutput, RusotoError<UpdateTimeToLiveError>> {
763 | self.inner
764 | .policy
765 | .retry_if(
766 | move || {
767 | let client = self.inner.clone().client.clone();
768 | let input = input.clone();
769 | async move { client.update_time_to_live(input).await }
770 | },
771 | Counter(0),
772 | )
773 | .await
774 | }
775 |
776 | async fn describe_endpoints(
777 | &self
778 | ) -> Result<DescribeEndpointsOutput, RusotoError<DescribeEndpointsError>> {
779 | // no apparent retryable errors
780 | self.inner.client.describe_endpoints().await
781 | }
782 |
783 | async fn transact_get_items(
784 | &self,
785 | input: TransactGetItemsInput,
786 | ) -> Result<TransactGetItemsOutput, RusotoError<TransactGetItemsError>> {
787 | self.inner
788 | .policy
789 | .retry_if(
790 | move || {
791 | let client = self.inner.clone().client.clone();
792 | let input = input.clone();
793 | async move { client.transact_get_items(input).await }
794 | },
795 | Counter(0),
796 | )
797 | .await
798 | }
799 |
800 | async fn transact_write_items(
801 | &self,
802 | input: TransactWriteItemsInput,
803 | ) -> Result<TransactWriteItemsOutput, RusotoError<TransactWriteItemsError>> {
804 | self.inner
805 | .policy
806 | .retry_if(
807 | move || {
808 | let client = self.inner.clone().client.clone();
809 | let input = input.clone();
810 | async move { client.transact_write_items(input).await }
811 | },
812 | Counter(0),
813 | )
814 | .await
815 | }
816 |
817 | async fn batch_execute_statement(
818 | &self,
819 | input: BatchExecuteStatementInput,
820 | ) -> Result<BatchExecuteStatementOutput, RusotoError<BatchExecuteStatementError>> {
821 | self.inner.client.batch_execute_statement(input).await
822 | }
823 |
824 | async fn execute_statement(
825 | &self,
826 | input: ExecuteStatementInput,
827 | ) -> Result<ExecuteStatementOutput, RusotoError<ExecuteStatementError>> {
828 | self.inner.client.execute_statement(input).await
829 | }
830 |
831 | async fn execute_transaction(
832 | &self,
833 | input: ExecuteTransactionInput,
834 | ) -> Result<ExecuteTransactionOutput, RusotoError<ExecuteTransactionError>> {
835 | self.inner.client.execute_transaction(input).await
836 | }
837 |
838 | async fn describe_kinesis_streaming_destination(
839 | &self,
840 | input: DescribeKinesisStreamingDestinationInput,
841 | ) -> Result<
842 | DescribeKinesisStreamingDestinationOutput,
843 | RusotoError<DescribeKinesisStreamingDestinationError>,
844 | > {
845 | self.inner
846 | .client
847 | .describe_kinesis_streaming_destination(input)
848 | .await
849 | }
850 |
851 | async fn enable_kinesis_streaming_destination(
852 | &self,
853 | input: KinesisStreamingDestinationInput,
854 | ) -> Result<
855 | KinesisStreamingDestinationOutput,
856 | RusotoError<EnableKinesisStreamingDestinationError>,
857 | > {
858 | self.inner
859 | .client
860 | .enable_kinesis_streaming_destination(input)
861 | .await
862 | }
863 |
864 | async fn disable_kinesis_streaming_destination(
865 | &self,
866 | input: KinesisStreamingDestinationInput,
867 | ) -> Result<
868 | KinesisStreamingDestinationOutput,
869 | RusotoError<DisableKinesisStreamingDestinationError>,
870 | > {
871 | self.inner
872 | .client
873 | .disable_kinesis_streaming_destination(input)
874 | .await
875 | }
876 |
877 | async fn export_table_to_point_in_time(
878 | &self,
879 | input: ExportTableToPointInTimeInput,
880 | ) -> Result<ExportTableToPointInTimeOutput, RusotoError<ExportTableToPointInTimeError>> {
881 | self.inner.client.export_table_to_point_in_time(input).await
882 | }
883 | }
884 |
885 | /// retry impl for Service error types
886 | macro_rules! retry {
887 | ($e:ty, $($p: pat)+) => {
888 | impl Retry for $e {
889 | fn retryable(&self) -> bool {
890 | // we allow unreachable_patterns because
891 | // _ => false because in some cases
892 | // all variants are retryable
893 | // in other cases, only a subset, hence
894 | // this type matching
895 | #[allow(unreachable_patterns)]
896 | match self {
897 | $($p)|+ => true,
898 | _ => false
899 | }
900 | }
901 | }
902 | }
903 | }
904 |
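
For readers unfamiliar with the space-separated pattern list accepted by `retry!`, the first invocation below expands to roughly the following (hand-expanded for illustration; it relies on the private `Retry` trait defined earlier in this module rather than being standalone code):

```rust
impl Retry for BatchGetItemError {
    fn retryable(&self) -> bool {
        #[allow(unreachable_patterns)]
        match self {
            BatchGetItemError::InternalServerError(_)
            | BatchGetItemError::ProvisionedThroughputExceeded(_) => true,
            _ => false,
        }
    }
}
```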
905 | retry!(
906 | BatchGetItemError,
907 | BatchGetItemError::InternalServerError(_) BatchGetItemError::ProvisionedThroughputExceeded(_)
908 | );
909 |
910 | retry!(
911 | BatchWriteItemError,
912 | BatchWriteItemError::InternalServerError(_) BatchWriteItemError::ProvisionedThroughputExceeded(_)
913 | );
914 |
915 | retry!(
916 | CreateBackupError,
917 | CreateBackupError::InternalServerError(_) CreateBackupError::LimitExceeded(_)
918 | );
919 |
920 | retry!(
921 | CreateGlobalTableError,
922 | CreateGlobalTableError::InternalServerError(_) CreateGlobalTableError::LimitExceeded(_)
923 | );
924 |
925 | retry!(
926 | CreateTableError,
927 | CreateTableError::InternalServerError(_) CreateTableError::LimitExceeded(_)
928 | );
929 |
930 | retry!(
931 | DeleteBackupError,
932 | DeleteBackupError::InternalServerError(_) DeleteBackupError::LimitExceeded(_)
933 | );
934 |
935 | retry!(
936 | DeleteItemError,
937 | DeleteItemError::InternalServerError(_) DeleteItemError::ProvisionedThroughputExceeded(_)
938 | );
939 |
940 | retry!(
941 | DeleteTableError,
942 | DeleteTableError::InternalServerError(_) DeleteTableError::LimitExceeded(_)
943 | );
944 |
945 | retry!(
946 | DescribeBackupError,
947 | DescribeBackupError::InternalServerError(_)
948 | );
949 |
950 | retry!(
951 | DescribeContinuousBackupsError,
952 | DescribeContinuousBackupsError::InternalServerError(_)
953 | );
954 |
955 | retry!(
956 | DescribeGlobalTableError,
957 | DescribeGlobalTableError::InternalServerError(_)
958 | );
959 |
960 | retry!(
961 | DescribeGlobalTableSettingsError,
962 | DescribeGlobalTableSettingsError::InternalServerError(_)
963 | );
964 |
965 | retry!(
966 | DescribeLimitsError,
967 | DescribeLimitsError::InternalServerError(_)
968 | );
969 |
970 | retry!(
971 | DescribeTableError,
972 | DescribeTableError::InternalServerError(_)
973 | );
974 |
975 | retry!(
976 | GetItemError,
977 | GetItemError::InternalServerError(_) GetItemError::ProvisionedThroughputExceeded(_)
978 | );
979 |
980 | retry!(ListBackupsError, ListBackupsError::InternalServerError(_));
981 |
982 | retry!(ListTablesError, ListTablesError::InternalServerError(_));
983 |
984 | retry!(
985 | ListTagsOfResourceError,
986 | ListTagsOfResourceError::InternalServerError(_)
987 | );
988 |
989 | retry!(
990 | PutItemError,
991 | PutItemError::InternalServerError(_) PutItemError::ProvisionedThroughputExceeded(_)
992 | );
993 |
994 | retry!(
995 | QueryError,
996 | QueryError::InternalServerError(_) QueryError::ProvisionedThroughputExceeded(_)
997 | );
998 |
999 | retry!(
1000 | RestoreTableFromBackupError,
1001 | RestoreTableFromBackupError::InternalServerError(_)
1002 | );
1003 |
1004 | retry!(
1005 | RestoreTableToPointInTimeError,
1006 | RestoreTableToPointInTimeError::InternalServerError(_)
1007 | );
1008 |
1009 | retry!(
1010 | ScanError,
1011 | ScanError::InternalServerError(_) ScanError::ProvisionedThroughputExceeded(_)
1012 | );
1013 |
1014 | retry!(
1015 | TagResourceError,
1016 | TagResourceError::InternalServerError(_) TagResourceError::LimitExceeded(_)
1017 | );
1018 |
1019 | retry!(
1020 | UntagResourceError,
1021 | UntagResourceError::InternalServerError(_) UntagResourceError::LimitExceeded(_)
1022 | );
1023 |
1024 | retry!(
1025 | UpdateContinuousBackupsError,
1026 | UpdateContinuousBackupsError::InternalServerError(_)
1027 | );
1028 |
1029 | retry!(
1030 | UpdateGlobalTableError,
1031 | UpdateGlobalTableError::InternalServerError(_)
1032 | );
1033 |
1034 | retry!(
1035 | UpdateGlobalTableSettingsError,
1036 | UpdateGlobalTableSettingsError::InternalServerError(_)
1037 | );
1038 |
1039 | retry!(
1040 | UpdateItemError,
1041 | UpdateItemError::InternalServerError(_) UpdateItemError::ProvisionedThroughputExceeded(_)
1042 | );
1043 |
1044 | retry!(
1045 | UpdateTableError,
1046 | UpdateTableError::InternalServerError(_) UpdateTableError::LimitExceeded(_)
1047 | );
1048 |
1049 | retry!(
1050 | UpdateTimeToLiveError,
1051 | UpdateTimeToLiveError::InternalServerError(_) UpdateTimeToLiveError::LimitExceeded(_)
1052 | );
1053 |
1054 | retry!(
1055 | ListGlobalTablesError,
1056 | ListGlobalTablesError::InternalServerError(_)
1057 | );
1058 |
1059 | retry!(
1060 | DescribeTimeToLiveError,
1061 | DescribeTimeToLiveError::InternalServerError(_)
1062 | );
1063 |
1064 | retry!(
1065 | TransactGetItemsError,
1066 | TransactGetItemsError::InternalServerError(_) TransactGetItemsError::ProvisionedThroughputExceeded(_)
1067 | );
1068 |
1069 | retry!(
1070 | TransactWriteItemsError,
1071 | TransactWriteItemsError::InternalServerError(_) TransactWriteItemsError::ProvisionedThroughputExceeded(_)
1072 | );
1073 |
1074 | #[cfg(test)]
1075 | mod tests {
1076 | use super::*;
1077 | #[test]
1078 | fn policy_has_default() {
1079 | assert_eq!(
1080 | Policy::default(),
1081 | Policy::Exponential(5, Duration::from_millis(100))
1082 | );
1083 | }
1084 |
1085 | #[test]
1086 | fn policy_impl_into_for_retry_policy() {
1087 | fn test(_: impl Into