├── .gitignore
├── .travis.yml
├── Gemfile
├── LICENSE.txt
├── README.md
├── README_mysql.md
├── Rakefile
├── fluent-plugin-mysql.gemspec
├── lib
│   └── fluent
│       └── plugin
│           ├── out_mysql.rb
│           └── out_mysql_bulk.rb
└── test
    ├── helper.rb
    └── plugin
        ├── test_out_mysql.rb
        └── test_out_mysql_bulk.rb
/.gitignore:
--------------------------------------------------------------------------------
1 | *.gem
2 | *.rbc
3 | .bundle
4 | .config
5 | .yardoc
6 | Gemfile.lock
7 | InstalledFiles
8 | _yardoc
9 | coverage
10 | doc/
11 | lib/bundler/man
12 | pkg
13 | rdoc
14 | spec/reports
15 | test/tmp
16 | test/version_tmp
17 | tmp
18 | test.conf
19 |
--------------------------------------------------------------------------------
/.travis.yml:
--------------------------------------------------------------------------------
1 | language: ruby
2 |
3 | rvm:
4 | - 2.1
5 | - 2.2.3
6 | - 2.3.0
7 |
8 | gemfile:
9 | - Gemfile
10 |
11 | before_install:
12 | - gem update bundler
13 |
14 | script: 'bundle exec rake test'
15 |
--------------------------------------------------------------------------------
/Gemfile:
--------------------------------------------------------------------------------
1 | source 'https://rubygems.org'
2 |
3 | # Specify your gem's dependencies in fluent-plugin-mysql.gemspec
4 | gemspec
5 |
--------------------------------------------------------------------------------
/LICENSE.txt:
--------------------------------------------------------------------------------
1 | Copyright (c) 2012- TAGOMORI Satoshi
2 |
3 | Licensed under the Apache License, Version 2.0 (the "License");
4 | you may not use this file except in compliance with the License.
5 | You may obtain a copy of the License at
6 |
7 | http://www.apache.org/licenses/LICENSE-2.0
8 |
9 | Unless required by applicable law or agreed to in writing, software
10 | distributed under the License is distributed on an "AS IS" BASIS,
11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | See the License for the specific language governing permissions and
13 | limitations under the License.
14 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 | # fluent-plugin-mysql, a plugin for [Fluentd](http://fluentd.org) [](http://travis-ci.org/tagomoris/fluent-plugin-mysql)
3 |
4 | fluent-plugin-mysql provides high-performance bulk inserts into MySQL and supports ON DUPLICATE KEY UPDATE.
5 |
6 | ## Note
7 | fluent-plugin-mysql-bulk has been merged into this repository.
8 |
9 | The [mysql plugin](README_mysql.md) is deprecated. Use mysql_bulk instead.
10 |
11 | v0.1.5 supports only fluentd 0.12.x, and v0.2.0 supports only fluentd 0.14.x.
12 |
13 | ## Parameters
14 |
15 | param|value
16 | --------|------
17 | host|database host (default: 127.0.0.1)
18 | port|database port (default: 3306)
19 | database|database name (required)
20 | username|user (required)
21 | password|password (default: blank)
22 | sslkey|path to client key (default: nil)
23 | sslcert|path to client cert (default: nil)
24 | sslca|path to CA cert (default: nil)
25 | sslcapath|path to CA certs (default: nil)
26 | sslcipher|SSL cipher (default: nil)
27 | sslverify|verify server certificate (default: nil)
28 | column_names|bulk insert columns (required)
29 | key_names|value key names; ${time} is a placeholder for Time.at(time).strftime("%Y-%m-%d %H:%M:%S") (default: column_names)
30 | json_key_names|key names whose values are stored as JSON, comma-separated
31 | unixtimestamp_key_names|key names whose values are stored as datetime converted from a unix timestamp, comma-separated
32 | table|bulk insert table (required)
33 | on_duplicate_key_update|enable ON DUPLICATE KEY UPDATE (true/false)
34 | on_duplicate_update_keys|columns to update on duplicate key, comma-separated
35 | transaction_isolation_level|set transaction isolation level (default: nil)
36 |
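For `json_key_names`, `unixtimestamp_key_names` and `transaction_isolation_level`, a minimal sketch is shown below (the `access` table and its column names are assumptions, mirroring the plugin's test configuration):

```
<match mysql.input>
  @type mysql_bulk
  host localhost
  database test_app_development
  username root
  password hogehoge
  key_names id,url,request_headers,params,created_at,updated_at
  column_names id,url,request_headers_json,params_json,created_date,updated_date
  # request_headers and params are serialized to JSON before insert
  json_key_names request_headers,params
  # created_at/updated_at arrive as unix timestamps and are converted to DATETIME strings
  unixtimestamp_key_names created_at,updated_at
  table access
  # optional: issue SET SESSION TRANSACTION ISOLATION LEVEL before the bulk insert
  transaction_isolation_level read_committed
  flush_interval 10s
</match>
```
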
37 | ## Configuration Example (bulk insert)
38 |
39 | ```
40 | <match mysql.input>
41 |   @type mysql_bulk
42 |   host localhost
43 |   database test_app_development
44 |   username root
45 |   password hogehoge
46 |   column_names id,user_name,created_at,updated_at
47 |   table users
48 |   flush_interval 10s
49 | </match>
50 | ```
51 |
52 | Assume the following input is coming:
53 |
54 | ```js
55 | mysql.input: {"user_name":"toyama","created_at":"2014/01/03 21:35:15","updated_at":"2014/01/03 21:35:15","dummy":"hogehoge"}
56 | mysql.input: {"user_name":"toyama2","created_at":"2014/01/03 21:35:21","updated_at":"2014/01/03 21:35:21","dummy":"hogehoge"}
57 | mysql.input: {"user_name":"toyama3","created_at":"2014/01/03 21:35:27","updated_at":"2014/01/03 21:35:27","dummy":"hogehoge"}
58 | ```
59 |
60 | then the result becomes as below:
61 |
62 | ```sql
63 | +-----+-----------+---------------------+---------------------+
64 | | id | user_name | created_at | updated_at |
65 | +-----+-----------+---------------------+---------------------+
66 | | 1 | toyama | 2014-01-03 21:35:15 | 2014-01-03 21:35:15 |
67 | | 2 | toyama2 | 2014-01-03 21:35:21 | 2014-01-03 21:35:21 |
68 | | 3 | toyama3 | 2014-01-03 21:35:27 | 2014-01-03 21:35:27 |
69 | +-----+-----------+---------------------+---------------------+
70 | ```
71 |
72 | The running query is:
73 |
74 | ```sql
75 | INSERT INTO users (id,user_name,created_at,updated_at) VALUES (NULL,'toyama','2014/01/03 21:35:15','2014/01/03 21:35:15'),(NULL,'toyama2','2014/01/03 21:35:21','2014/01/03 21:35:21')
76 | ```
77 |
78 | ## Configuration Example (bulk insert, update on duplicate key)
79 |
80 | ```
81 | <match mysql.input>
82 |   @type mysql_bulk
83 |   host localhost
84 |   database test_app_development
85 |   username root
86 |   password hogehoge
87 |   column_names id,user_name,created_at,updated_at
88 |   table users
89 |   on_duplicate_key_update true
90 |   on_duplicate_update_keys user_name,updated_at
91 |   flush_interval 60s
92 | </match>
93 | ```
94 |
95 | Assume the following input is coming:
96 |
97 | ```js
98 | mysql.input: {"id":"1" ,"user_name":"toyama7","created_at":"2014/01/03 21:58:03","updated_at":"2014/01/03 21:58:03"}
99 | mysql.input: {"id":"2" ,"user_name":"toyama7","created_at":"2014/01/03 21:58:06","updated_at":"2014/01/03 21:58:06"}
100 | mysql.input: {"id":"3" ,"user_name":"toyama7","created_at":"2014/01/03 21:58:08","updated_at":"2014/01/03 21:58:08"}
101 | mysql.input: {"id":"10","user_name":"toyama7","created_at":"2014/01/03 21:58:18","updated_at":"2014/01/03 21:58:18"}
102 | ```
103 |
104 | then the result becomes as below:
105 |
106 | ```sql
107 | +-----+-----------+---------------------+---------------------+
108 | | id | user_name | created_at | updated_at |
109 | +-----+-----------+---------------------+---------------------+
110 | | 1 | toyama7 | 2014-01-03 21:35:15 | 2014-01-03 21:58:03 |
111 | | 2 | toyama7 | 2014-01-03 21:35:21 | 2014-01-03 21:58:06 |
112 | | 3 | toyama7 | 2014-01-03 21:35:27 | 2014-01-03 21:58:08 |
113 | | 10 | toyama7 | 2014-01-03 21:58:18 | 2014-01-03 21:58:18 |
114 | +-----+-----------+---------------------+---------------------+
115 | ```
116 |
117 | If `id` is duplicated, `user_name` and `updated_at` are updated instead of inserting a new row; the generated query is sketched below.
118 |
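For reference, a sketch of the query mysql_bulk issues for this configuration (values abbreviated). The ON DUPLICATE KEY UPDATE clause is built from `on_duplicate_update_keys`; a custom SQL expression per column can also be supplied via `on_duplicate_update_custom_values` (see the plugin tests for an example):

```sql
INSERT INTO users (`id`,`user_name`,`created_at`,`updated_at`)
VALUES ('1','toyama7','2014/01/03 21:58:03','2014/01/03 21:58:03'),('2','toyama7','2014/01/03 21:58:06','2014/01/03 21:58:06'), ...
ON DUPLICATE KEY UPDATE user_name = VALUES(user_name),updated_at = VALUES(updated_at)
```
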
119 |
120 | ## Configuration Example (bulk insert, fluentd keys different from column names)
121 |
122 | ```
123 | <match mysql.input>
124 |   @type mysql_bulk
125 |   host localhost
126 |   database test_app_development
127 |   username root
128 |   password hogehoge
129 |   column_names id,user_name,created_at,updated_at
130 |   key_names id,user,created_date,updated_date
131 |   table users
132 |   flush_interval 10s
133 | </match>
134 | ```
135 |
136 | Assume the following input is coming:
137 |
138 | ```js
139 | mysql.input: {"user":"toyama","created_date":"2014/01/03 21:35:15","updated_date":"2014/01/03 21:35:15","dummy":"hogehoge"}
140 | mysql.input: {"user":"toyama2","created_date":"2014/01/03 21:35:21","updated_date":"2014/01/03 21:35:21","dummy":"hogehoge"}
141 | mysql.input: {"user":"toyama3","created_date":"2014/01/03 21:35:27","updated_date":"2014/01/03 21:35:27","dummy":"hogehoge"}
142 | ```
143 |
144 | then the result becomes as below:
145 |
146 | ```sql
147 | +-----+-----------+---------------------+---------------------+
148 | | id | user_name | created_at | updated_at |
149 | +-----+-----------+---------------------+---------------------+
150 | | 1 | toyama | 2014-01-03 21:35:15 | 2014-01-03 21:35:15 |
151 | | 2 | toyama2 | 2014-01-03 21:35:21 | 2014-01-03 21:35:21 |
152 | | 3 | toyama3 | 2014-01-03 21:35:27 | 2014-01-03 21:35:27 |
153 | +-----+-----------+---------------------+---------------------+
154 | ```
155 |
156 | ## Configuration Example (bulk insert, time complement)
157 |
158 | ```
159 | <match mysql.input>
160 |   @type mysql_bulk
161 |   host localhost
162 |   database test_app_development
163 |   username root
164 |   password hogehoge
165 |   column_names id,user_name,created_at
166 |   key_names id,user,${time}
167 |   table users
168 |   flush_interval 10s
169 | </match>
170 | ```
171 |
172 | Assume the following input is coming:
173 |
174 | ```js
175 | 2014-01-03 21:35:15+09:00: mysql.input: {"user":"toyama","dummy":"hogehoge"}
176 | 2014-01-03 21:35:21+09:00: mysql.input: {"user":"toyama2","dummy":"hogehoge"}
177 | 2014-01-03 21:35:27+09:00: mysql.input: {"user":"toyama3","dummy":"hogehoge"}
178 | ```
179 |
180 | then the `created_at` column is set from the time attribute of the fluentd event:
181 |
182 | ```sql
183 | +-----+-----------+---------------------+
184 | | id | user_name | created_at |
185 | +-----+-----------+---------------------+
186 | | 1 | toyama | 2014-01-03 21:35:15 |
187 | | 2 | toyama2 | 2014-01-03 21:35:21 |
188 | | 3 | toyama3 | 2014-01-03 21:35:27 |
189 | +-----+-----------+---------------------+
190 | ```
191 |
192 | ## Configuration Example (bulk insert, time complement with a specific timezone)
193 |
194 | As described above, the `${time}` placeholder sets the time with `Time.at(time).strftime("%Y-%m-%d %H:%M:%S")`.
195 | This uses the fluentd server's default timezone.
196 | If you want to use a specific timezone, use the include_time_key feature.
197 | This is useful when the fluentd server and MySQL use different timezones.
198 | You can use various timezone formats; see:
199 | http://docs.fluentd.org/articles/formatter-plugin-overview
200 |
201 | ```
202 | <match mysql.input>
203 |   @type mysql_bulk
204 |   host localhost
205 |   database test_app_development
206 |   username root
207 |   password hogehoge
208 |
209 |   include_time_key yes
210 |   timezone +00
211 |   time_format %Y-%m-%d %H:%M:%S
212 |   time_key created_at
213 |
214 |   column_names id,user_name,created_at
215 |   key_names id,user,created_at
216 |   table users
217 |   flush_interval 10s
218 | </match>
219 | ```
220 |
221 | Assume the following input is coming (the fluentd server uses the JST +09 timezone):
222 |
223 | ```js
224 | 2014-01-03 21:35:15+09:00: mysql.input: {"user":"toyama","dummy":"hogehoge"}
225 | 2014-01-03 21:35:21+09:00: mysql.input: {"user":"toyama2","dummy":"hogehoge"}
226 | 2014-01-03 21:35:27+09:00: mysql.input: {"user":"toyama3","dummy":"hogehoge"}
227 | ```
228 |
229 | then the `created_at` column is set from the time attribute of the fluentd event, with the timezone converted to +00 (UTC):
230 |
231 | ```sql
232 | +-----+-----------+---------------------+
233 | | id | user_name | created_at |
234 | +-----+-----------+---------------------+
235 | | 1 | toyama | 2014-01-03 12:35:15 |
236 | | 2 | toyama2 | 2014-01-03 12:35:21 |
237 | | 3 | toyama3 | 2014-01-03 12:35:27 |
238 | +-----+-----------+---------------------+
239 | ```
240 |
241 | ## Configuration Example (bulk insert with tag placeholder for table name)
242 |
243 | This description is for fluentd v0.14.x users.
244 |
245 | ```
246 | <match mysql.input>
247 |   @type mysql_bulk
248 |   host localhost
249 |   database test_app_development
250 |   username root
251 |   password hogehoge
252 |   column_names id,user_name,created_at
253 |   key_names id,user,${time}
254 |   table users_${tag}
255 |   <buffer tag>
256 |     @type memory
257 |     flush_interval 60s
258 |   </buffer>
259 | </match>
260 | ```
261 |
262 | Assume the following input is coming:
263 |
264 | ```js
265 | 2016-09-26 18:42:13+09:00: mysql.input: {"user":"toyama","dummy":"hogehoge"}
266 | 2016-09-26 18:42:16+09:00: mysql.input: {"user":"toyama2","dummy":"hogehoge"}
267 | 2016-09-26 18:42:19+09:00: mysql.input: {"user":"toyama3","dummy":"hogehoge"}
268 | ```
269 |
270 | then the `created_at` column is set from the time attribute of the fluentd event:
271 |
272 | ```sql
273 | mysql> select * from users_mysql_input;
274 | +----+-----------+---------------------+
275 | | id | user_name | created_at |
276 | +----+-----------+---------------------+
277 | | 1 | toyama | 2016-09-26 18:42:13 |
278 | | 2 | toyama2 | 2016-09-26 18:42:16 |
279 | | 3 | toyama3 | 2016-09-26 18:42:19 |
280 | +----+-----------+---------------------+
281 | 3 rows in set (0.00 sec)
282 | ```
283 |
284 | ## Configuration Example (bulk insert with time format placeholder for table name)
285 |
286 | This description is for fluentd v0.14.x users.
287 |
288 | ```
289 | <match mysql.input>
290 |   @type mysql_bulk
291 |   host localhost
292 |   database test_app_development
293 |   username root
294 |   password hogehoge
295 |   column_names id,user_name,created_at
296 |   key_names id,user,${time}
297 |   table users_%Y%m%d
298 |   <buffer time>
299 |     @type memory
300 |     timekey 60s
301 |     timekey_wait 60s
302 |   </buffer>
303 | </match>
304 | ```
305 |
306 | Assume the following input is coming:
307 |
308 | ```js
309 | 2016-09-26 18:37:06+09:00: mysql.input: {"user":"toyama","dummy":"hogehoge"}
310 | 2016-09-26 18:37:08+09:00: mysql.input: {"user":"toyama2","dummy":"hogehoge"}
311 | 2016-09-26 18:37:11+09:00: mysql.input: {"user":"toyama3","dummy":"hogehoge"}
312 | ```
313 |
314 | then the `created_at` column is set from the time attribute of the fluentd event:
315 |
316 | ```sql
317 | mysql> select * from users_20160926;
318 | +----+-----------+---------------------+
319 | | id | user_name | created_at |
320 | +----+-----------+---------------------+
321 | | 1 | toyama | 2016-09-26 18:37:06 |
322 | | 2 | toyama2 | 2016-09-26 18:37:08 |
323 | | 3 | toyama3 | 2016-09-26 18:37:11 |
324 | +----+-----------+---------------------+
325 | 3 rows in set (0.00 sec)
326 | ```
327 |
328 | ## spec
329 |
330 | ```
331 | bundle install
332 | rake test
333 | ```
334 |
335 | ## todo
336 |
337 | Divide bulk inserts into chunks (for example, 1000 rows per query).
338 |
339 |
340 | ## Contributing
341 |
342 | 1. Fork it
343 | 2. Create your feature branch (`git checkout -b my-new-feature`)
344 | 3. Commit your changes (`git commit -am 'Add some feature'`)
345 | 4. Push to the branch (`git push origin my-new-feature`)
346 | 5. Create new [Pull Request](../../pull/new/master)
347 |
348 | ## Copyright
349 |
350 | Copyright (c) 2016 Hiroshi Toyama. See [LICENSE](LICENSE.txt) for details.
351 |
--------------------------------------------------------------------------------
/README_mysql.md:
--------------------------------------------------------------------------------
1 | # fluent-plugin-mysql
2 |
3 | ## Component
4 |
5 | ### MysqlOutput
6 |
7 | [Fluentd](http://fluentd.org) plugin to store data into MySQL tables over SQL, mapping record values to columns, or storing the whole record as JSON in a single column.
8 |
9 | ## Configuration
10 |
11 | ### MysqlOutput
12 |
13 | MysqlOutput needs the MySQL server's host/port/database/username/password, and an INSERT format given either as SQL or as a table name and columns.
14 |
15 |
16 | type mysql
17 | host master.db.service.local
18 | # port 3306 # default
19 | database application_logs
20 | username myuser
21 | password mypass
22 | key_names status,bytes,vhost,path,rhost,agent,referer
23 | sql INSERT INTO accesslog (status,bytes,vhost,path,rhost,agent,referer) VALUES (?,?,?,?,?,?,?)
24 | flush_interval 5s
25 |
26 |
27 |
28 | type mysql
29 | host master.db.service.local
30 | database application_logs
31 | username myuser
32 | password mypass
33 | key_names status,bytes,vhost,path,rhost,agent,referer
34 | table accesslog
35 | # 'columns' order must be the same as 'key_names'
36 | columns status,bytes,vhost,path,rhost,agent,referer
37 | flush_interval 5s
38 |
39 |
40 | Or, insert JSON into a single column.
41 |
42 |
43 | type mysql
44 | host master.db.service.local
45 | database application_logs
46 | username root
47 | table accesslog
48 | columns jsondata
49 | format json
50 | flush_interval 5s
51 |
52 |
53 | To include time/tag into output, use `include_time_key` and `include_tag_key`, like this:
54 |
55 |
56 | type mysql
57 | host my.mysql.local
58 | database anydatabase
59 | username yourusername
60 | password secret
61 |
62 | include_time_key yes
63 | ### default `time_format` is ISO-8601
64 | # time_format %Y%m%d-%H%M%S
65 | ### default `time_key` is 'time'
66 | # time_key timekey
67 |
68 | include_tag_key yes
69 | ### default `tag_key` is 'tag'
70 | # tag_key tagkey
71 |
72 | table anydata
73 | key_names time,tag,field1,field2,field3,field4
74 | sql INSERT INTO baz (coltime,coltag,col1,col2,col3,col4) VALUES (?,?,?,?,?,?)
75 |
76 |
77 | Or, for json:
78 |
79 |
80 | type mysql
81 | host database.local
82 | database foo
83 | username root
84 |
85 | include_time_key yes
86 | utc # with UTC timezone output (default: localtime)
87 | time_format %Y%m%d-%H%M%S
88 | time_key timeattr
89 |
90 | include_tag_key yes
91 | tag_key tagattr
92 | table accesslog
93 | columns jsondata
94 | format json
95 |
96 | #=> inserts JSON data into column 'jsondata' with additional attributes 'timeattr' and 'tagattr'
97 |
98 | ### JsonPath format
99 |
100 | You can use [JsonPath](http://goessner.net/articles/JsonPath/) selectors as key_names, such as:
101 |
102 |
103 | type mysql
104 | host database.local
105 | database foo
106 | username bar
107 |
108 | include_time_key yes
109 | utc
110 | include_tag_key yes
111 | table baz
112 |
113 | format jsonpath
114 | key_names time, tag, id, data.name, tags[0]
115 | sql INSERT INTO baz (coltime,coltag,id,name,tag1) VALUES (?,?,?,?,?)
116 |
117 |
118 | Which for a record like:
119 |
120 | `{ 'id' => 15, 'data'=> {'name' => 'jsonpath' }, 'tags' => ['unit', 'simple'] }`
121 |
122 | will generate the following insert values:
123 |
124 | `('2012-12-17T01:23:45Z','test',15,'jsonpath','unit')`
125 |
126 | ## Prerequisites
127 |
128 | `fluent-plugin-mysql` uses the `mysql2` gem, and `mysql2` links against `libmysqlclient`. See [Installing](https://github.com/brianmario/mysql2#installing) for installation details.
129 |
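A typical setup might look like the following sketch (the package manager command and package name are assumptions and vary by platform):

    # install the MySQL client library first, e.g. on Debian/Ubuntu:
    sudo apt-get install libmysqlclient-dev
    # then install the plugin gem (pulls in mysql2 and mysql2-cs-bind):
    gem install fluent-plugin-mysql
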
130 | ## TODO
131 |
132 | * implement 'tag_mapped'
133 | * dynamic tag based table selection
134 |
135 | ## Copyright
136 |
137 | * Copyright
138 |   * Copyright (C) 2016- TAGOMORI Satoshi (tagomoris)
139 | * License
140 |   * Apache License, Version 2.0
141 |
--------------------------------------------------------------------------------
/Rakefile:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env rake
2 | require "bundler/gem_tasks"
3 |
4 | require 'rake/testtask'
5 | Rake::TestTask.new(:test) do |test|
6 | test.libs << 'lib' << 'test'
7 | test.pattern = 'test/**/test_*.rb'
8 | test.verbose = true
9 | end
10 |
11 | task :default => :test
12 |
--------------------------------------------------------------------------------
/fluent-plugin-mysql.gemspec:
--------------------------------------------------------------------------------
1 | # -*- encoding: utf-8 -*-
2 | Gem::Specification.new do |gem|
3 | gem.name = "fluent-plugin-mysql"
4 | gem.version = "0.3.4"
5 | gem.authors = ["TAGOMORI Satoshi", "Toyama Hiroshi"]
6 | gem.email = ["tagomoris@gmail.com", "toyama0919@gmail.com"]
7 | gem.description = %q{fluent plugin to insert into mysql as json (single column) or as an insert statement}
8 | gem.summary = %q{fluent plugin to insert into mysql}
9 | gem.homepage = "https://github.com/tagomoris/fluent-plugin-mysql"
10 | gem.license = "Apache-2.0"
11 |
12 | gem.files = `git ls-files`.split($\)
13 | gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
14 | gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
15 | gem.require_paths = ["lib"]
16 |
17 | gem.add_runtime_dependency "fluentd", ['>= 0.14.8', '< 2']
18 | gem.add_runtime_dependency "mysql2-cs-bind"
19 | gem.add_runtime_dependency "jsonpath"
20 | gem.add_runtime_dependency "oj"
21 | gem.add_development_dependency "rake"
22 | gem.add_development_dependency "test-unit"
23 | gem.add_development_dependency "timecop", "~> 0.8.0"
24 | end
25 |
--------------------------------------------------------------------------------
/lib/fluent/plugin/out_mysql.rb:
--------------------------------------------------------------------------------
1 | class Fluent::MysqlOutput < Fluent::BufferedOutput
2 | Fluent::Plugin.register_output('mysql', self)
3 |
4 | include Fluent::SetTimeKeyMixin
5 | include Fluent::SetTagKeyMixin
6 |
7 | config_param :host, :string
8 | config_param :port, :integer, :default => nil
9 | config_param :database, :string
10 | config_param :username, :string
11 | config_param :password, :string, :default => '', :secret => true
12 | config_param :sslkey, :string, :default => nil
13 | config_param :sslcert, :string, :default => nil
14 | config_param :sslca, :string, :default => nil
15 | config_param :sslcapath, :string, :default => nil
16 | config_param :sslcipher, :string, :default => nil
17 | config_param :sslverify, :bool, :default => nil
18 |
19 | config_param :key_names, :string, :default => nil # nil allowed for json format
20 | config_param :sql, :string, :default => nil
21 | config_param :table, :string, :default => nil
22 | config_param :columns, :string, :default => nil
23 |
24 | config_param :format, :string, :default => "raw" # or json
25 |
26 | attr_accessor :handler
27 |
28 | def initialize
29 | super
30 | require 'mysql2-cs-bind'
31 | require 'jsonpath'
32 | end
33 |
34 | # Define `log` method for v0.10.42 or earlier
35 | unless method_defined?(:log)
36 | define_method("log") { $log }
37 | end
38 |
39 | def configure(conf)
40 | super
41 |
42 | log.warn "[mysql] This plugin is deprecated. You should use mysql_bulk instead."
43 |
44 | # TODO tag_mapped
45 |
46 | case @format
47 | when 'json'
48 | @format_proc = Proc.new{|tag, time, record| record.to_json}
49 | when 'jsonpath'
50 | @key_names = @key_names.split(/\s*,\s*/)
51 | @format_proc = Proc.new do |tag, time, record|
52 | json = record.to_json
53 | @key_names.map do |k|
54 | JsonPath.new(k.strip).on(json).first
55 | end
56 | end
57 | else
58 | @key_names = @key_names.split(/\s*,\s*/)
59 | @format_proc = Proc.new{|tag, time, record| @key_names.map{|k| record[k]}}
60 | end
61 |
62 | if @columns.nil? and @sql.nil?
63 | raise Fluent::ConfigError, "columns or sql MUST be specified, but missing"
64 | end
65 | if @columns and @sql
66 | raise Fluent::ConfigError, "both of columns and sql are specified, but specify one of them"
67 | end
68 |
69 | if @sql
70 | begin
71 | if @format == 'json'
72 | Mysql2::Client.pseudo_bind(@sql, [nil])
73 | else
74 | Mysql2::Client.pseudo_bind(@sql, @key_names.map{|n| nil})
75 | end
76 | rescue ArgumentError => e
77 | raise Fluent::ConfigError, "mismatch between sql placeholders and key_names"
78 | end
79 | else # columns
80 | raise Fluent::ConfigError, "table missing" unless @table
81 | @columns = @columns.split(/\s*,\s*/)
82 | cols = @columns.join(',')
83 | placeholders = if @format == 'json'
84 | '?'
85 | else
86 | @key_names.map{|k| '?'}.join(',')
87 | end
88 | @sql = "INSERT INTO #{@table} (#{cols}) VALUES (#{placeholders})"
89 | end
90 | end
91 |
92 | def start
93 | super
94 | end
95 |
96 | def shutdown
97 | super
98 | end
99 |
100 | def format(tag, time, record)
101 | [tag, time, @format_proc.call(tag, time, record)].to_msgpack
102 | end
103 |
104 | def client
105 | Mysql2::Client.new({
106 | :host => @host, :port => @port,
107 | :username => @username, :password => @password,
108 | :database => @database,
109 | :sslkey => @sslkey,
110 | :sslcert => @sslcert,
111 | :sslca => @sslca,
112 | :sslcapath => @sslcapath,
113 | :sslcipher => @sslcipher,
114 | :sslverify => @sslverify,
115 | :flags => Mysql2::Client::MULTI_STATEMENTS,
116 | })
117 | end
118 |
119 | def write(chunk)
120 | handler = self.client
121 | chunk.msgpack_each { |tag, time, data|
122 | handler.xquery(@sql, data)
123 | }
124 | handler.close
125 | end
126 | end
127 |
--------------------------------------------------------------------------------
/lib/fluent/plugin/out_mysql_bulk.rb:
--------------------------------------------------------------------------------
1 | require 'fluent/plugin/output'
2 | require 'oj'
3 |
4 | module Fluent::Plugin
5 | class MysqlBulkOutput < Output
6 | Fluent::Plugin.register_output('mysql_bulk', self)
7 |
8 | helpers :compat_parameters, :inject
9 |
10 | config_param :host, :string, default: '127.0.0.1',
11 | desc: "Database host."
12 | config_param :port, :integer, default: 3306,
13 | desc: "Database port."
14 | config_param :database, :string,
15 | desc: "Database name."
16 | config_param :username, :string,
17 | desc: "Database user."
18 | config_param :password, :string, default: '', secret: true,
19 | desc: "Database password."
20 | config_param :sslkey, :string, default: nil,
21 | desc: "SSL key."
22 | config_param :sslcert, :string, default: nil,
23 | desc: "SSL cert."
24 | config_param :sslca, :string, default: nil,
25 | desc: "SSL CA."
26 | config_param :sslcapath, :string, default: nil,
27 | desc: "SSL CA path."
28 | config_param :sslcipher, :string, default: nil,
29 | desc: "SSL cipher."
30 | config_param :sslverify, :bool, default: nil,
31 | desc: "SSL Verify Server Certificate."
32 |
33 | config_param :column_names, :string,
34 | desc: "Bulk insert column."
35 | config_param :key_names, :string, default: nil,
36 | desc: <<-DESC
37 | Value key names, ${time} is placeholder Time.at(time).strftime("%Y-%m-%d %H:%M:%S").
38 | DESC
39 | config_param :json_key_names, :string, default: nil,
40 | desc: "Key names which store data as json"
41 | config_param :table, :string,
42 | desc: "Bulk insert table."
43 |
44 | config_param :unixtimestamp_key_names, :string, default: nil,
45 | desc: "Key names which store data as datetime from unix time stamp"
46 |
47 | config_param :on_duplicate_key_update, :bool, default: false,
48 | desc: "On duplicate key update enable."
49 | config_param :on_duplicate_update_keys, :string, default: nil,
50 | desc: "On duplicate key update column, comma separator."
51 | config_param :on_duplicate_update_custom_values, :string, default: nil,
52 | desc: "Custom values for ON DUPLICATE KEY UPDATE, comma separated. Specify the column name itself to use the inserted value, or a custom value written as ${sql expression}."
53 |
54 | config_param :transaction_isolation_level, :enum, list: [:read_uncommitted, :read_committed, :repeatable_read, :serializable], default: nil,
55 | desc: "Set transaction isolation level."
56 |
57 | attr_accessor :handler
58 |
59 | def initialize
60 | super
61 | require 'mysql2-cs-bind'
62 | end
63 |
64 | def configure(conf)
65 | compat_parameters_convert(conf, :buffer, :inject)
66 | super
67 |
68 | if @column_names.nil?
69 | fail Fluent::ConfigError, 'column_names MUST be specified, but missing'
70 | end
71 |
72 | if @on_duplicate_key_update
73 | if @on_duplicate_update_keys.nil?
74 | fail Fluent::ConfigError, 'on_duplicate_key_update is true, but on_duplicate_update_keys is not specified!'
75 | end
76 | @on_duplicate_update_keys = @on_duplicate_update_keys.split(',')
77 |
78 | if !@on_duplicate_update_custom_values.nil?
79 | @on_duplicate_update_custom_values = @on_duplicate_update_custom_values.split(',')
80 | if @on_duplicate_update_custom_values.length != @on_duplicate_update_keys.length
81 | fail Fluent::ConfigError, <<-DESC
82 | on_duplicate_update_keys and on_duplicate_update_custom_values must be the same length
83 | DESC
84 | end
85 | end
86 |
87 | @on_duplicate_key_update_sql = ' ON DUPLICATE KEY UPDATE '
88 | updates = []
89 | @on_duplicate_update_keys.each_with_index do |update_column, i|
90 | if @on_duplicate_update_custom_values.nil? || @on_duplicate_update_custom_values[i] == "#{update_column}"
91 | updates << "#{update_column} = VALUES(#{update_column})"
92 | else
93 | value = @on_duplicate_update_custom_values[i].to_s.match(/\${(.*)}/)[1]
94 | escape_value = Mysql2::Client.escape(value)
95 | updates << "#{update_column} = #{escape_value}"
96 | end
97 | end
98 | @on_duplicate_key_update_sql += updates.join(',')
99 | end
100 |
101 | @column_names = @column_names.split(',').collect(&:strip)
102 | @key_names = @key_names.nil? ? @column_names : @key_names.split(',').collect(&:strip)
103 | @json_key_names = @json_key_names.split(',') if @json_key_names
104 | @unixtimestamp_key_names = @unixtimestamp_key_names.split(',') if @unixtimestamp_key_names
105 | end
106 |
107 | def check_table_schema(database: @database, table: @table)
108 | _client = client(database)
109 | result = _client.xquery("SHOW COLUMNS FROM #{table}")
110 | max_lengths = []
111 | @column_names.each do |column|
112 | info = result.select { |x| x['Field'] == column }.first
113 | r = /(char|varchar)\(([\d]+)\)/
114 | begin
115 | max_length = info['Type'].scan(r)[0][1].to_i
116 | rescue
117 | max_length = nil
118 | end
119 | max_lengths << max_length
120 | end
121 | max_lengths
122 | ensure
123 | _client.close unless _client.nil?
124 | end
125 |
126 | def format(tag, time, record)
127 | record = inject_values_to_record(tag, time, record)
128 | [tag, time, record].to_msgpack
129 | end
130 |
131 | def formatted_to_msgpack_binary
132 | true
133 | end
134 |
135 | def multi_workers_ready?
136 | true
137 | end
138 |
139 | def client(database)
140 | Mysql2::Client.new(
141 | host: @host,
142 | port: @port,
143 | username: @username,
144 | password: @password,
145 | database: database,
146 | sslkey: @sslkey,
147 | sslcert: @sslcert,
148 | sslca: @sslca,
149 | sslcapath: @sslcapath,
150 | sslcipher: @sslcipher,
151 | sslverify: @sslverify,
152 | flags: Mysql2::Client::MULTI_STATEMENTS
153 | )
154 | end
155 |
156 | def expand_placeholders(metadata)
157 | database = extract_placeholders(@database, metadata).gsub('.', '_')
158 | table = extract_placeholders(@table, metadata).gsub('.', '_')
159 | return database, table
160 | end
161 |
162 | def write(chunk)
163 | database, table = expand_placeholders(chunk.metadata)
164 | max_lengths = check_table_schema(database: database, table: table)
165 | @handler = client(database)
166 | values = []
167 | values_template = "(#{ @column_names.map { |key| '?' }.join(',') })"
168 | chunk.msgpack_each do |tag, time, data|
169 | data = format_proc.call(tag, time, data, max_lengths)
170 | values << Mysql2::Client.pseudo_bind(values_template, data)
171 | end
172 | sql = "INSERT INTO #{table} (#{@column_names.map{|x| "`#{x.to_s.gsub('`', '``')}`"}.join(',')}) VALUES #{values.join(',')}"
173 | sql += @on_duplicate_key_update_sql if @on_duplicate_key_update
174 |
175 | log.info "bulk insert values size (table: #{table}) => #{values.size}"
176 | @handler.query("SET SESSION TRANSACTION ISOLATION LEVEL #{transaction_isolation_level}") if @transaction_isolation_level
177 | @handler.xquery(sql)
178 | @handler.close
179 | end
180 |
181 | private
182 |
183 | def format_proc
184 | proc do |tag, time, record, max_lengths|
185 | values = []
186 | @key_names.each_with_index do |key, i|
187 | if key == '${time}'
188 | value = Time.at(time).strftime('%Y-%m-%d %H:%M:%S')
189 | else
190 | if max_lengths[i].nil? || record[key].nil?
191 | value = record[key]
192 | else
193 | value = record[key].to_s.slice(0, max_lengths[i])
194 | end
195 |
196 | if @json_key_names && @json_key_names.include?(key)
197 | value = Oj.dump(value)
198 | end
199 |
200 | if @unixtimestamp_key_names && @unixtimestamp_key_names.include?(key)
201 | value = Time.at(value).strftime('%Y-%m-%d %H:%M:%S')
202 | end
203 | end
204 | values << value
205 | end
206 | values
207 | end
208 | end
209 |
210 | def transaction_isolation_level
211 | case @transaction_isolation_level
212 | when :read_uncommitted
213 | "READ UNCOMMITTED"
214 | when :read_committed
215 | "READ COMMITTED"
216 | when :repeatable_read
217 | "REPEATABLE READ"
218 | when :serializable
219 | "SERIALIZABLE"
220 | end
221 | end
222 | end
223 | end
224 |
--------------------------------------------------------------------------------
/test/helper.rb:
--------------------------------------------------------------------------------
1 | require 'rubygems'
2 | require 'bundler'
3 | begin
4 | Bundler.setup(:default, :development)
5 | rescue Bundler::BundlerError => e
6 | $stderr.puts e.message
7 | $stderr.puts "Run `bundle install` to install missing gems"
8 | exit e.status_code
9 | end
10 | require 'test/unit'
11 |
12 | $LOAD_PATH.unshift(File.join(File.dirname(__FILE__), '..', 'lib'))
13 | $LOAD_PATH.unshift(File.dirname(__FILE__))
14 | require 'fluent/test'
15 | unless ENV.has_key?('VERBOSE')
16 | nulllogger = Object.new
17 | nulllogger.instance_eval {|obj|
18 | def method_missing(method, *args)
19 | # pass
20 | end
21 | }
22 | $log = nulllogger
23 | end
24 |
25 | require 'fluent/plugin/out_mysql'
26 | require 'fluent/plugin/out_mysql_bulk'
27 |
28 | class Test::Unit::TestCase
29 | end
30 |
--------------------------------------------------------------------------------
/test/plugin/test_out_mysql.rb:
--------------------------------------------------------------------------------
1 | require 'helper'
2 | require 'mysql2-cs-bind'
3 |
4 | class MysqlOutputTest < Test::Unit::TestCase
5 | def setup
6 | Fluent::Test.setup
7 | end
8 |
9 | CONFIG = %[
10 | host db.local
11 | database testing
12 | username testuser
13 | sql INSERT INTO tbl SET jsondata=?
14 | format json
15 | ]
16 |
17 | def create_driver(conf = CONFIG, tag='test')
18 | d = Fluent::Test::BufferedOutputTestDriver.new(Fluent::MysqlOutput, tag).configure(conf)
19 | d.instance.instance_eval {
20 | def client
21 | obj = Object.new
22 | obj.instance_eval {
23 | def xquery(*args); [1]; end
24 | def close; true; end
25 | }
26 | obj
27 | end
28 | }
29 | d
30 | end
31 |
32 | def test_configure
33 | d = create_driver %[
34 | host database.local
35 | database foo
36 | username bar
37 | sql INSERT INTO baz SET jsondata=?
38 | format json
39 | ]
40 | d = create_driver %[
41 | host database.local
42 | database foo
43 | username bar
44 | table baz
45 | columns jsondata
46 | format json
47 | ]
48 | d = create_driver %[
49 | host database.local
50 | database foo
51 | username bar
52 | password mogera
53 | key_names field1,field2,field3
54 | table baz
55 | columns col1, col2 ,col3
56 | ]
57 | assert_equal ['field1', 'field2', 'field3'], d.instance.key_names
58 | assert_equal 'INSERT INTO baz (col1,col2,col3) VALUES (?,?,?)', d.instance.sql
59 | d = create_driver %[
60 | host database.local
61 | database foo
62 | username bar
63 | password mogera
64 | key_names field1 ,field2, field3
65 | table baz
66 | columns col1, col2 ,col3
67 | ]
68 | assert_equal ['field1', 'field2', 'field3'], d.instance.key_names
69 | assert_equal 'INSERT INTO baz (col1,col2,col3) VALUES (?,?,?)', d.instance.sql
70 |
71 | assert_raise(Fluent::ConfigError) {
72 | d = create_driver %[
73 | host database.local
74 | database foo
75 | username bar
76 | password mogera
77 | key_names field1,field2,field3
78 | sql INSERT INTO baz (col1,col2,col3,col4) VALUES (?,?,?,?)
79 | ]
80 | }
81 |
82 |
83 | end
84 |
85 | def test_format
86 | d = create_driver
87 |
88 | time = Time.parse("2011-01-02 13:14:15 UTC").to_i
89 | d.emit({"a"=>1}, time)
90 | d.emit({"a"=>2}, time)
91 |
92 | #d.expect_format %[2011-01-02T13:14:15Z\ttest\t{"a":1}\n]
93 | #d.expect_format %[2011-01-02T13:14:15Z\ttest\t{"a":2}\n]
94 | d.expect_format ['test', time, {"a" => 1}.to_json].to_msgpack
95 | d.expect_format ['test', time, {"a" => 2}.to_json].to_msgpack
96 |
97 | d.run
98 | end
99 |
100 | def test_time_and_tag_key
101 | d = create_driver %[
102 | host database.local
103 | database foo
104 | username bar
105 | password mogera
106 | include_time_key yes
107 | utc
108 | include_tag_key yes
109 | table baz
110 | key_names time,tag,field1,field2,field3,field4
111 | sql INSERT INTO baz (coltime,coltag,col1,col2,col3,col4) VALUES (?,?,?,?,?,?)
112 | ]
113 | assert_equal 'INSERT INTO baz (coltime,coltag,col1,col2,col3,col4) VALUES (?,?,?,?,?,?)', d.instance.sql
114 |
115 | time = Time.parse('2012-12-17 01:23:45 UTC').to_i
116 | record = {'field1'=>'value1','field2'=>'value2','field3'=>'value3','field4'=>'value4'}
117 | d.emit(record, time)
118 | d.expect_format ['test', time, ['2012-12-17T01:23:45Z','test','value1','value2','value3','value4']].to_msgpack
119 | d.run
120 | end
121 |
122 | def test_time_and_tag_key_complex
123 | d = create_driver %[
124 | host database.local
125 | database foo
126 | username bar
127 | password mogera
128 | include_time_key yes
129 | utc
130 | time_format %Y%m%d-%H%M%S
131 | time_key timekey
132 | include_tag_key yes
133 | tag_key tagkey
134 | table baz
135 | key_names timekey,tagkey,field1,field2,field3,field4
136 | sql INSERT INTO baz (coltime,coltag,col1,col2,col3,col4) VALUES (?,?,?,?,?,?)
137 | ]
138 | assert_equal 'INSERT INTO baz (coltime,coltag,col1,col2,col3,col4) VALUES (?,?,?,?,?,?)', d.instance.sql
139 |
140 | time = Time.parse('2012-12-17 09:23:45 +0900').to_i # JST(+0900)
141 | record = {'field1'=>'value1','field2'=>'value2','field3'=>'value3','field4'=>'value4'}
142 | d.emit(record, time)
143 | d.expect_format ['test', time, ['20121217-002345','test','value1','value2','value3','value4']].to_msgpack
144 | d.run
145 | end
146 |
147 | def test_time_and_tag_key_json
148 | d = create_driver %[
149 | host database.local
150 | database foo
151 | username bar
152 | password mogera
153 | include_time_key yes
154 | utc
155 | time_format %Y%m%d-%H%M%S
156 | time_key timekey
157 | include_tag_key yes
158 | tag_key tagkey
159 | table accesslog
160 | columns jsondata
161 | format json
162 | ]
163 | assert_equal 'INSERT INTO accesslog (jsondata) VALUES (?)', d.instance.sql
164 |
165 | time = Time.parse('2012-12-17 09:23:45 +0900').to_i # JST(+0900)
166 | record = {'field1'=>'value1'}
167 | d.emit(record, time)
168 | # Ruby 1.9.3 and later preserve Hash insertion order, so this code is OK.
169 | d.expect_format ['test', time, record.merge({'timekey'=>'20121217-002345','tagkey'=>'test'}).to_json].to_msgpack
170 | d.run
171 | end
172 |
173 | def test_jsonpath_format
174 | d = create_driver %[
175 | host database.local
176 | database foo
177 | username bar
178 | password mogera
179 | include_time_key yes
180 | utc
181 | include_tag_key yes
182 | table baz
183 | format jsonpath
184 | key_names time, tag, id, data.name, tags[0]
185 | sql INSERT INTO baz (coltime,coltag,id,name,tag1) VALUES (?,?,?,?,?)
186 | ]
187 | assert_equal 'INSERT INTO baz (coltime,coltag,id,name,tag1) VALUES (?,?,?,?,?)', d.instance.sql
188 |
189 | time = Time.parse('2012-12-17 01:23:45 UTC').to_i
190 | record = { 'id' => 15, 'data'=> {'name' => 'jsonpath' }, 'tags' => ['unit', 'simple'] }
191 | d.emit(record, time)
192 | d.expect_format ['test', time, ['2012-12-17T01:23:45Z','test',15,'jsonpath','unit']].to_msgpack
193 | d.run
194 | end
195 |
196 | def test_write
197 | # hmm....
198 | end
199 | end
200 |
--------------------------------------------------------------------------------
/test/plugin/test_out_mysql_bulk.rb:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 | require 'helper'
3 | require 'mysql2-cs-bind'
4 | require 'fluent/test/driver/output'
5 | require 'fluent/plugin/buffer'
6 | require 'fluent/config'
7 | require 'time'
8 | require 'timecop'
9 |
10 | class MysqlBulkOutputTest < Test::Unit::TestCase
11 | def setup
12 | Fluent::Test.setup
13 | end
14 |
15 | def config_element(name = 'test', argument = '', params = {}, elements = [])
16 | Fluent::Config::Element.new(name, argument, params, elements)
17 | end
18 |
19 | CONFIG = %[
20 | database test_app_development
21 | username root
22 | password hogehoge
23 | column_names id,user_name,created_at
24 | key_names id,users,created_at
25 | table users
26 | ]
27 |
28 | def create_driver(conf = CONFIG)
29 | d = Fluent::Test::Driver::Output.new(Fluent::Plugin::MysqlBulkOutput).configure(conf)
30 | d.instance.instance_eval {
31 | def client
32 | obj = Object.new
33 | obj.instance_eval {
34 | def xquery(*args); [1]; end
35 | def close; true; end
36 | }
37 | obj
38 | end
39 | }
40 | d
41 | end
42 |
43 | def create_metadata(timekey: nil, tag: nil, variables: nil)
44 | Fluent::Plugin::Buffer::Metadata.new(timekey, tag, variables)
45 | end
46 |
47 | class TestExpandPlaceholders < self
48 | data("table" => {"database" => "test_app_development",
49 | "table" => "users_${tag}",
50 | "extracted_database" => "test_app_development",
51 | "extracted_table" => "users_input_mysql"
52 | },
53 | "database" => {"database" => "test_app_development_${tag}",
54 | "table" => "users",
55 | "extracted_database" => "test_app_development_input_mysql",
56 | "extracted_table" => "users"
57 | },
58 | )
59 | def test_expand_tag_placeholder(data)
60 | config = config_element('ROOT', '', {
61 | "@type" => "mysql_bulk",
62 | "host" => "localhost",
63 | "database" => data["database"],
64 | "username" => "root",
65 | "password" => "hogehoge",
66 | "column_names" => "id,user_name,created_at",
67 | "table" => data["table"],
68 | }, [config_element('buffer', 'tag', {
69 | "@type" => "memory",
70 | "flush_interval" => "60s",
71 | }, [])])
72 | d = create_driver(config)
73 | time = Time.now
74 | metadata = create_metadata(timekey: time.to_i, tag: 'input.mysql')
75 | database, table = d.instance.expand_placeholders(metadata)
76 | assert_equal(data["extracted_database"], database)
77 | assert_equal(data["extracted_table"], table)
78 | end
79 |
80 | def setup
81 | Timecop.freeze(Time.parse("2016-09-26"))
82 | end
83 |
84 | data("table" => {"database" => "test_app_development",
85 | "table" => "users_%Y%m%d",
86 | "extracted_database" => "test_app_development",
87 | "extracted_table" => "users_20160926"
88 | },
89 | "database" => {"database" => "test_app_development_%Y%m%d",
90 | "table" => "users",
91 | "extracted_database" => "test_app_development_20160926",
92 | "extracted_table" => "users"
93 | },
94 | )
95 | def test_expand_time_placeholder(data)
96 | config = config_element('ROOT', '', {
97 | "@type" => "mysql_bulk",
98 | "host" => "localhost",
99 | "database" => data["database"],
100 | "username" => "root",
101 | "password" => "hogehoge",
102 | "column_names" => "id,user_name,created_at",
103 | "table" => data["table"],
104 | }, [config_element('buffer', 'time', {
105 | "@type" => "memory",
106 | "timekey" => "60s",
107 | "timekey_wait" => "60s"
108 | }, [])])
109 | d = create_driver(config)
110 | time = Time.now
111 | metadata = create_metadata(timekey: time.to_i, tag: 'input.mysql')
112 | database, table = d.instance.expand_placeholders(metadata)
113 | assert_equal(data["extracted_database"], database)
114 | assert_equal(data["extracted_table"], table)
115 | end
116 |
117 | def teardown
118 | Timecop.return
119 | end
120 | end
121 |
122 | def test_configure_error
123 | assert_raise(Fluent::ConfigError) do
124 | create_driver %[
125 | host localhost
126 | database test_app_development
127 | username root
128 | password hogehoge
129 | table users
130 | on_duplicate_key_update true
131 | on_duplicate_update_keys user_name,updated_at
132 | flush_interval 10s
133 | ]
134 | end
135 |
136 | assert_raise(Fluent::ConfigError) do
137 | create_driver %[
138 | host localhost
139 | database test_app_development
140 | username root
141 | password hogehoge
142 | column_names id,user_name,created_at,updated_at
143 | table users
144 | on_duplicate_key_update true
145 | flush_interval 10s
146 | ]
147 | end
148 |
149 | assert_raise(Fluent::ConfigError) do
150 | create_driver %[
151 | host localhost
152 | username root
153 | password hogehoge
154 | column_names id,user_name,created_at,updated_at
155 | table users
156 | on_duplicate_key_update true
157 | on_duplicate_update_keys user_name,updated_at
158 | flush_interval 10s
159 | ]
160 | end
161 |
162 | assert_raise(Fluent::ConfigError) do
163 | create_driver %[
164 | host localhost
165 | username root
166 | password hogehoge
167 | column_names id,user_name,login_count,created_at,updated_at
168 | table users
169 | on_duplicate_key_update true
170 | on_duplicate_update_keys login_count,updated_at
171 | on_duplicate_update_custom_values login_count
172 | flush_interval 10s
173 | ]
174 | end
175 | end
176 |
177 | def test_configure
178 | # not define format(default csv)
179 | assert_nothing_raised(Fluent::ConfigError) do
180 | create_driver %[
181 | host localhost
182 | database test_app_development
183 | username root
184 | password hogehoge
185 | column_names id,user_name,created_at,updated_at
186 | table users
187 | on_duplicate_key_update true
188 | on_duplicate_update_keys user_name,updated_at
189 | flush_interval 10s
190 | ]
191 | end
192 |
193 | assert_nothing_raised(Fluent::ConfigError) do
194 | create_driver %[
195 | database test_app_development
196 | username root
197 | password hogehoge
198 | column_names id,user_name,created_at,updated_at
199 | table users
200 | ]
201 | end
202 |
203 | assert_nothing_raised(Fluent::ConfigError) do
204 | create_driver %[
205 | database test_app_development
206 | username root
207 | password hogehoge
208 | column_names id,user_name,created_at,updated_at
209 | table users
210 | on_duplicate_key_update true
211 | on_duplicate_update_keys user_name,updated_at
212 | ]
213 | end
214 |
215 | assert_nothing_raised(Fluent::ConfigError) do
216 | create_driver %[
217 | database test_app_development
218 | username root
219 | password hogehoge
220 | column_names id,user_name,created_at,updated_at
221 | key_names id,user,created_date,updated_date
222 | table users
223 | on_duplicate_key_update true
224 | on_duplicate_update_keys user_name,updated_at
225 | ]
226 | end
227 |
228 | assert_nothing_raised(Fluent::ConfigError) do
229 | create_driver %[
230 | database test_app_development
231 | username root
232 | password hogehoge
233 | key_names id,url,request_headers,params,created_at,updated_at
234 | column_names id,url,request_headers_json,params_json,created_date,updated_date
235 | json_key_names request_headers,params
236 | table access
237 | ]
238 | end
239 |
240 | assert_nothing_raised(Fluent::ConfigError) do
241 | create_driver %[
242 | database test_app_development
243 | username root
244 | password hogehoge
245 | column_names id,user_name,login_count,created_at,updated_at
246 | key_names id,user_name,login_count,created_date,updated_date
247 | table users
248 | on_duplicate_key_update true
249 | on_duplicate_update_keys login_count,updated_at
250 | on_duplicate_update_custom_values ${`login_count` + 1},updated_at
251 | ]
252 | end
253 | end
254 |
255 | def test_variables
256 | d = create_driver %[
257 | database test_app_development
258 | username root
259 | password hogehoge
260 | column_names id,user_name,created_at,updated_at
261 | table users
262 | on_duplicate_key_update true
263 | on_duplicate_update_keys user_name,updated_at
264 | ]
265 |
266 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.key_names
267 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.column_names
268 | assert_equal nil, d.instance.json_key_names
269 | assert_equal nil, d.instance.unixtimestamp_key_names
270 | assert_equal " ON DUPLICATE KEY UPDATE user_name = VALUES(user_name),updated_at = VALUES(updated_at)", d.instance.instance_variable_get(:@on_duplicate_key_update_sql)
271 |
272 | d = create_driver %[
273 | database test_app_development
274 | username root
275 | password hogehoge
276 | column_names id,user_name,created_at,updated_at
277 | table users
278 | ]
279 |
280 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.key_names
281 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.column_names
282 | assert_equal nil, d.instance.json_key_names
283 | assert_equal nil, d.instance.unixtimestamp_key_names
284 | assert_nil d.instance.instance_variable_get(:@on_duplicate_key_update_sql)
285 |
286 | d = create_driver %[
287 | database test_app_development
288 | username root
289 | password hogehoge
290 | key_names id,user_name,created_at,updated_at
291 | column_names id,user,created_date,updated_date
292 | table users
293 | ]
294 |
295 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.key_names
296 | assert_equal ['id','user','created_date','updated_date'], d.instance.column_names
297 | assert_equal nil, d.instance.json_key_names
298 | assert_equal nil, d.instance.unixtimestamp_key_names
299 | assert_nil d.instance.instance_variable_get(:@on_duplicate_key_update_sql)
300 |
301 | d = create_driver %[
302 | database test_app_development
303 | username root
304 | password hogehoge
305 | key_names id,url,request_headers,params,created_at,updated_at
306 | column_names id,url,request_headers_json,params_json,created_date,updated_date
307 | unixtimestamp_key_names created_at,updated_at
308 | json_key_names request_headers,params
309 | table access
310 | ]
311 |
312 | assert_equal ['id','url','request_headers','params','created_at','updated_at'], d.instance.key_names
313 | assert_equal ['id','url','request_headers_json','params_json','created_date','updated_date'], d.instance.column_names
314 | assert_equal ['request_headers','params'], d.instance.json_key_names
315 | assert_equal ['created_at', 'updated_at'], d.instance.unixtimestamp_key_names
316 | assert_nil d.instance.instance_variable_get(:@on_duplicate_key_update_sql)
317 |
318 | d = create_driver %[
319 | database test_app_development
320 | username root
321 | password hogehoge
322 | column_names id,user_name,login_count,created_at,updated_at
323 | table users
324 | on_duplicate_key_update true
325 | on_duplicate_update_keys login_count,updated_at
326 | on_duplicate_update_custom_values ${`login_count` + 1},updated_at
327 | ]
328 |
329 | assert_equal ['id','user_name','login_count','created_at','updated_at'], d.instance.key_names
330 | assert_equal ['id','user_name','login_count','created_at','updated_at'], d.instance.column_names
331 | assert_equal nil, d.instance.json_key_names
332 | assert_equal nil, d.instance.unixtimestamp_key_names
333 | assert_equal " ON DUPLICATE KEY UPDATE login_count = `login_count` + 1,updated_at = VALUES(updated_at)", d.instance.instance_variable_get(:@on_duplicate_key_update_sql)
334 | end
335 |
336 | def test_spaces_in_columns
337 | d = create_driver %[
338 | database test_app_development
339 | username root
340 | password hogehoge
341 | column_names id, user_name, created_at, updated_at
342 | table users
343 | ]
344 |
345 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.key_names
346 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.column_names
347 |
348 | d = create_driver %[
349 | database test_app_development
350 | username root
351 | password hogehoge
352 | key_names id, user_name, created_at, updated_at
353 | column_names id, user_name, created_at, updated_at
354 | table users
355 | ]
356 |
357 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.key_names
358 | assert_equal ['id','user_name','created_at','updated_at'], d.instance.column_names
359 | end
360 | end
361 |
--------------------------------------------------------------------------------