├── .gitignore
├── README.md
├── batch
│   ├── pom.xml
│   └── src
│       ├── main
│       │   ├── assembly
│       │   │   └── build-assembly.xml
│       │   ├── bin
│       │   │   └── launcher.sh
│       │   ├── conf
│       │   │   ├── employee.csv
│       │   │   ├── employee.json
│       │   │   └── log4j.xml
│       │   └── java
│       │       └── com
│       │           └── ts
│       │               └── blog
│       │                   └── batch
│       │                       ├── CategoryAssignmentApp.java
│       │                       ├── Launcher.java
│       │                       ├── dataset
│       │                       │   ├── DataSetOps.java
│       │                       │   └── Employee.java
│       │                       └── functions
│       │                           └── CategoryAssignment.java
│       └── test
│           └── resources
│               └── testng.xml
├── img
│   └── spark_java_drools_maven.jpg
├── input-data
│   └── blog.csv
├── pom.xml
└── rules-core
    ├── kie-rules.iml
    ├── pom.xml
    └── src
        └── main
            ├── java
            │   └── com
            │       └── ts
            │           └── blog
            │               └── rules
            │                   ├── KieFunctions.java
            │                   └── type
            │                       └── Category.java
            └── resources
                ├── META-INF
                │   └── kmodule.xml
                └── com
                    └── ts
                        └── blog
                            └── kie
                                ├── BlogCategory.drl
                                └── MetaData.drl

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
# Created by .ignore support plugin (hsz.mobi)
### JDeveloper template
# default application storage directory used by the IDE Performance Cache feature
.data/

# used for ADF styles caching
temp/

# default output directories
classes/
deploy/
javadoc/

# lock file, a part of Oracle Credential Store Framework
cwallet.sso.lck

### Maven template
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
release.properties
dependency-reduced-pom.xml
buildNumber.properties
.mvn/timing.properties

# Avoid ignoring Maven wrapper jar file (.jar files are usually ignored)
!/.mvn/wrapper/maven-wrapper.jar
### Java template
# Compiled class file
*.class

# Log file
*.log

# BlueJ files
*.ctxt

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
*.jar
*.war
*.ear
*.zip
*.tar.gz
*.rar

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*
.gitignore
.idea/
rules/rules.iml

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Business Rules Engine: JBoss Drools Integration with Apache Spark
![Apache Spark](img/spark_java_drools_maven.jpg)


While working on a business-rules integration project in Spark, I encountered various challenges in setting up the project structure efficiently: bundling the project artifact, creating a launcher script, defining the spark-submit command with the right classpath, laying out the Kie rules and resource structure, and wiring the business rules into the Spark app. In this guide I share the insights and strategies I have learned along the way.

There are two Maven modules, kept separate so the rules can be changed independently at runtime:

- **rules-core** - [Drools rules](https://github.com/rahulsquid/spark-drools/tree/master/rules-core)
- **batch** - [Spark application](https://github.com/rahulsquid/spark-drools/tree/master/batch)

### The following major points are covered in the examples:

- **Build process and launcher module**:

  We use Maven for the entire build and artifact creation.
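  A minimal build-and-run sketch, referenced from the next bullet as well; the zip name under `batch/target` is an assumption derived from the pom coordinates and the assembly id `bin`, so adjust it to what your build actually produces:

      # Build everything from the repo root
      mvn clean package
      # Unpack the batch bundle (artifact name assumed, not confirmed by the repo)
      cd batch/target
      unzip blog-batch-1.0-SNAPSHOT-bin.zip -d blog-batch-bin
      cd blog-batch-bin
      # SPARK_HOME must point at a Spark 2.x installation
      export SPARK_HOME=/opt/spark
      bin/launcher.sh --input ../../../input-data/blog.csv --output /tmp/blog-out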
  Check out the [Assembly](batch/src/main/assembly) and [Binary](batch/src/main/bin) in the batch module to learn how to define the launcher script and bundle it into the artifact (see the build sketch above).

- **How to build a kjar using a Maven project** -

  the rules-core module

- **Declare a new type using Drools** -

  rules-core/src/main/resources/com/ts/blog/kie/MetaData.drl

- **Avoid infinite loop execution when any property is updated** -

  rules-core/src/main/resources/com/ts/blog/kie/MetaData.drl
- **Set up a JBoss Drools and Spark Maven project** -
  the batch module contains the Spark Drools implementation
- **Use the JBoss Drools KieClasspathContainer in a Spark application** -

  batch/src/main/java/com/ts/blog/batch/functions/CategoryAssignment.java
- **Load a kjar in a Spark application** - the batch module

- **Use a Spark transformation to execute rules** - in the batch module, CategoryAssignment runs the rules and assigns a Category to each record

- **Kie fluent API tutorial** - TODO


## Batch module launcher example

    bin/launcher.sh --help
    Options category 'Startup':
      --[no]help [-h] (a boolean; default: "true")
        Print usage info
      --input [-i] (a string; default: "")
        input path
      --output [-o] (a string; default: "")
        output path


## You may use the input-data directory for a test run

## References:

- [Kie Drools](https://www.drools.org/)
- [Apache Spark](https://spark.apache.org/)
- [Apache Maven](https://maven.apache.org/)

--------------------------------------------------------------------------------
/batch/pom.xml:
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <parent>
    <artifactId>blog-parent</artifactId>
    <groupId>com.ts.blog</groupId>
    <version>1.0-SNAPSHOT</version>
  </parent>
  <modelVersion>4.0.0</modelVersion>

  <artifactId>blog-batch</artifactId>
  <name>Blog::Spark Batch</name>
  <description>Application contains spark drools examples</description>

  <dependencies>
    <dependency>
      <groupId>com.ts.blog</groupId>
      <artifactId>blog-rules</artifactId>
    </dependency>
    <dependency>
      <groupId>org.drools</groupId>
      <artifactId>drools-core</artifactId>
    </dependency>
    <dependency>
      <groupId>org.drools</groupId>
      <artifactId>drools-compiler</artifactId>
    </dependency>
    <dependency>
      <groupId>org.kie</groupId>
      <artifactId>kie-api</artifactId>
    </dependency>
    <dependency>
      <groupId>com.github.pcj</groupId>
      <artifactId>google-options</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
    </dependency>
    <dependency>
      <groupId>org.testng</groupId>
      <artifactId>testng</artifactId>
      <version>7.7.0</version>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
          <suiteXmlFiles>
            <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
          </suiteXmlFiles>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.1.0</version>
        <configuration>
          <descriptors>
            <descriptor>src/main/assembly/build-assembly.xml</descriptor>
          </descriptors>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

--------------------------------------------------------------------------------
/batch/src/main/assembly/build-assembly.xml:
--------------------------------------------------------------------------------
<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.0.0">
  <id>bin</id>
  <formats>
    <format>zip</format>
  </formats>
  <dependencySets>
    <dependencySet>
      <useProjectArtifact>false</useProjectArtifact>
      <outputDirectory>lib</outputDirectory>
      <unpack>false</unpack>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <directory>src/main/bin</directory>
      <outputDirectory>bin</outputDirectory>
      <includes>
        <include>*.*</include>
      </includes>
      <fileMode>0755</fileMode>
    </fileSet>
    <fileSet>
      <directory>src/main/conf</directory>
      <outputDirectory>conf</outputDirectory>
      <includes>
        <include>*.*</include>
      </includes>
      <fileMode>0755</fileMode>
    </fileSet>
    <fileSet>
      <directory>${project.build.directory}</directory>
      <includes>
        <include>*.jar</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>

--------------------------------------------------------------------------------
/batch/src/main/bin/launcher.sh:
--------------------------------------------------------------------------------
#!/usr/bin/env bash

if [[ -z "${SPARK_HOME}" ]]; then
    echo "SPARK_HOME is not defined"
    exit 1
fi
# Launch the category-assignment engine job
export APP_HOME="$( cd "$(dirname "$0")" ; cd ..; pwd -P )"

#export SPARK_HOME=/opt/mapr/spark/spark-2.1.0

CLASSPATH="$(find ${APP_HOME}/lib/ -name '*.jar' -type f -print0 | tr '\0' ',' )"

APP_JAR="$(ls $APP_HOME/*.jar)"

printf "APP_HOME:%s \nAPP_JAR=%s \nclasspath: %s\n" "$APP_HOME" "$APP_JAR" "$CLASSPATH"

${SPARK_HOME}/bin/spark-submit \
    --class com.ts.blog.batch.CategoryAssignmentApp \
    --master yarn \
    --deploy-mode client \
    --jars "$CLASSPATH" \
    "${APP_JAR}" "$@"
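With APP_HOME resolved, the script above boils down to a spark-submit call shaped roughly like the following; the paths and jar names here are illustrative, not taken from a real run:

    /opt/spark/bin/spark-submit \
      --class com.ts.blog.batch.CategoryAssignmentApp \
      --master yarn \
      --deploy-mode client \
      --jars /app/lib/kie-api-7.7.0.FINAL.jar,/app/lib/drools-core-7.69.0.Final.jar,... \
      /app/blog-batch-1.0-SNAPSHOT.jar --input input-data/blog.csv --output /tmp/blog-out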
"SPARK_HOME is not defined" 5 | exit 1 6 | fi 7 | #Launch Assignment engine job 8 | export APP_HOME="$( cd "$(dirname "$0")" ; cd ..;pwd -P )" 9 | 10 | #export SPARK_HOME=/opt/mapr/spark/spark-2.1.0 11 | 12 | CLASSPATH="$(find ${APP_HOME}/lib/ -name '*.jar' -type f -print0| tr '\0' ',' )" 13 | 14 | APP_JAR="$(ls $APP_HOME/*.jar)" 15 | 16 | printf "APP_HOME:%s \nAPP_JAR=%s \nclasspath: %s" "$APP_HOME" "$APP_JAR" "$CLASSPATH" 17 | 18 | ${SPARK_HOME}/bin/spark-submit \ 19 | --class com.ts.blog.batch.CategoryAssignmentApp \ 20 | --master yarn \ 21 | --deploy-mode client \ 22 | --jars "$CLASSPATH" \ 23 | "${APP_JAR}" "$@" -------------------------------------------------------------------------------- /batch/src/main/conf/employee.csv: -------------------------------------------------------------------------------- 1 | id,name 2 | 100,xyz 3 | 200,prq -------------------------------------------------------------------------------- /batch/src/main/conf/employee.json: -------------------------------------------------------------------------------- 1 | { "id": 100, "name": "xyz"} 2 | {"id": 200,"name": "prq"} -------------------------------------------------------------------------------- /batch/src/main/conf/log4j.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 5 | 6 | 7 | 8 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | -------------------------------------------------------------------------------- /batch/src/main/java/com/ts/blog/batch/CategoryAssignmentApp.java: -------------------------------------------------------------------------------- 1 | package com.ts.blog.batch; 2 | 3 | import com.google.devtools.common.options.Option; 4 | import com.google.devtools.common.options.OptionsBase; 5 | import com.google.devtools.common.options.OptionsParser; 6 | import com.ts.blog.batch.functions.CategoryAssignment; 7 | import org.apache.spark.sql.Dataset; 8 | import org.apache.spark.sql.Row; 9 | import org.apache.spark.sql.SparkSession; 10 | 11 | import java.util.function.Function; 12 | 13 | /** 14 | * This Spark App shows how to connect run Kie rules as a part of Spark 15 | * @author Rahul 16 | * @since 1.0 17 | * 18 | */ 19 | public class CategoryAssignmentApp { 20 | 21 | public static void main(String[] args) { 22 | SparkSession session = SparkSession.builder().master("local[*]").appName("CategoryAssignmentApp"). 23 | getOrCreate(); 24 | 25 | //Parse supplied arguments 26 | Options options = new Options().argsParser.apply(args); 27 | 28 | 29 | //Read CSV file using input arguments 30 | Dataset ds = session.read().option("header", "true").csv(options.input); 31 | //See csv content 32 | ds.show(20); 33 | 34 | //convert dataset to javaRDD and use mapPartition to process each record using kie api and assign Category 35 | //Use mapPartitions function to process entire partition records, to avoid loading kie rules again and again for each record. 36 | 37 | ds.javaRDD().mapPartitions(new CategoryAssignment("com.ts.blog.kie", "Blog")) 38 | //Use save instead of collect if you have too many records. 
--------------------------------------------------------------------------------
/batch/src/main/java/com/ts/blog/batch/Launcher.java:
--------------------------------------------------------------------------------
package com.ts.blog.batch;

/**
 * Local development entry point that runs CategoryAssignmentApp with hardcoded arguments.
 */
public class Launcher {
    public static void main(String[] args) {
        CategoryAssignmentApp.main(new String[]{"-i", "/Users/rahul/git/spark-drools/input-data/blog.csv",
                "-o", "/Users/rahul/git/spark-drools/input-data/"});
    }
}

--------------------------------------------------------------------------------
/batch/src/main/java/com/ts/blog/batch/dataset/DataSetOps.java:
--------------------------------------------------------------------------------
package com.ts.blog.batch.dataset;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.*;

/**
 * Created by rahul on 8/25/19.
 */
public class DataSetOps {


    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("Test");
        SparkSession session = SparkSession.builder().config(conf).getOrCreate();

        JavaSparkContext jsc = new JavaSparkContext(session.sparkContext());

        String path = "/Users/rahul/git/spark-drools/batch/src/main/conf/employee";

        //rdd(jsc, path);

        Dataset<Employee> dataset = fromJson(session, path);
        dataset.show();

        Row max = dataset.agg(org.apache.spark.sql.functions.max(dataset.col("id"))).as("max").head();
        System.out.println(max);

    }

    private static JavaRDD<Employee> rdd(JavaSparkContext jsc, String path) {
        // Note: assumes the CSV has no header line; each row is parsed positionally.
        return jsc.textFile(path + ".csv")
                .map(row -> row.split(","))
                .map(s -> new Employee(s));
    }

    private static Dataset<Employee> fromJson(SparkSession session, String path) {
        Encoder<Employee> employeeEncoder = Encoders.bean(Employee.class);
        return session.read().json(path + ".json").as(employeeEncoder);
    }
}
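With the sample conf/employee.json checked in above, the aggregation prints [200], since 200 is the largest id in the data. The same result can be expressed with a SQL expression (a sketch):

    // Equivalent max aggregation; shows a one-row table containing 200
    dataset.selectExpr("max(id)").show();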
--------------------------------------------------------------------------------
/batch/src/main/java/com/ts/blog/batch/dataset/Employee.java:
--------------------------------------------------------------------------------
package com.ts.blog.batch.dataset;

/**
 * Created by rahul on 8/25/19.
 */
public class Employee {

    private Integer id;
    private String name;

    public Employee() {}

    public Employee(String[] fields) {
        // Integer.parseInt, not Integer.getInteger: the latter looks up a system
        // property by name and would always leave id null here.
        id = Integer.parseInt(fields[0]);
        name = fields[1];
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;

        Employee employee = (Employee) o;

        return id.equals(employee.id);
    }

    @Override
    public int hashCode() {
        return id.hashCode();
    }

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
--------------------------------------------------------------------------------
/batch/src/main/java/com/ts/blog/batch/functions/CategoryAssignment.java:
--------------------------------------------------------------------------------
package com.ts.blog.batch.functions;

import com.ts.blog.batch.CategoryAssignmentApp;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.sql.Row;
import org.kie.api.KieBase;
import org.kie.api.KieBaseConfiguration;
import org.kie.api.KieServices;
import org.kie.api.definition.type.FactField;
import org.kie.api.definition.type.FactType;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.Iterator;
import java.util.List;

/**
 * CategoryAssignment converts each Row to a Kie fact (Blog) and assigns a Category based on the rules.
 *
 * @see KieSession
 * @see KieBase
 */
public class CategoryAssignment implements FlatMapFunction<Iterator<Row>, Object> {

    private final String factPkg;
    private final String factClass;

    public CategoryAssignment(String factPkg, String factClass) {
        this.factPkg = factPkg;
        this.factClass = factClass;
    }

    @Override
    public Iterator<Object> call(Iterator<Row> rowIte) throws Exception {

        // Initialize the Kie base from the classpath (the kjar must be on the executor classpath)
        Logger LOGGER = LogManager.getLogger(CategoryAssignmentApp.class);
        KieServices kieServices = KieServices.Factory.get();
        KieContainer kieContainer = kieServices.newKieClasspathContainer();
        KieBaseConfiguration config = kieServices.newKieBaseConfiguration();
        KieBase kieBase = kieContainer.newKieBase(config);


        // Get the Blog type from the Kie base
        FactType type = kieBase.getFactType(factPkg, factClass);
        if (type == null) {
            throw new RuntimeException("Kie fact not found: " + factPkg + "." + factClass);
        }
        List<FactField> blogField = type.getFields();
        List<Object> output = new ArrayList<>();
        // Iterate through each record and assign a category
        while (rowIte.hasNext()) {
            Row row = rowIte.next();
            Object blog = type.newInstance();
            // Populate each field value onto the fact object
            blogField.forEach(field -> {
                try {
                    if (field.getIndex() < row.size()) {
                        String value = row.getString(field.getIndex());
                        Object finalVal;
                        // Convert the raw string based on the declared field type
                        if (field.getType().equals(Date.class)) {
                            finalVal = new SimpleDateFormat("MM/dd/yyyy").parse(value);
                        } else if (field.getType().equals(Integer.class)) {
                            finalVal = Integer.valueOf(value);
                        } else {
                            finalVal = value;
                        }
                        if (finalVal != null) {
                            type.set(blog, field.getName(), finalVal);
                        }
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });

            // Open a fresh Kie session from the (already compiled) KieBase for each record
            KieSession kieSession = kieBase.newKieSession();
            kieSession.insert(blog);
            kieSession.setGlobal("LOGGER", LOGGER);
            // All set to fire rules
            kieSession.fireAllRules();
            kieSession.dispose();
            output.add(blog);
        }
        return output.iterator();
    }
}
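Building the KieBase at the top of call() means every partition, on every task, rescans the classpath. If the kjar cannot change during the life of the job, one common refinement is to cache the KieBase once per executor JVM. KieBaseHolder below is a hypothetical helper sketched under that assumption, not a class in this repo; call() would then use KieBaseHolder.get() instead of rebuilding:

    import org.kie.api.KieBase;
    import org.kie.api.KieServices;

    // Hypothetical per-JVM cache for the classpath KieBase (double-checked locking).
    public final class KieBaseHolder {
        private static volatile KieBase kieBase;

        private KieBaseHolder() {}

        public static KieBase get() {
            if (kieBase == null) {
                synchronized (KieBaseHolder.class) {
                    if (kieBase == null) {
                        KieServices ks = KieServices.Factory.get();
                        kieBase = ks.newKieClasspathContainer()
                                .newKieBase(ks.newKieBaseConfiguration());
                    }
                }
            }
            return kieBase;
        }
    }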
--------------------------------------------------------------------------------
/batch/src/test/resources/testng.xml:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/img/spark_java_drools_maven.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/rahulcodewiz/spark-drools/b5167b60d2c596d44fd0ca6ba8a793571078296c/img/spark_java_drools_maven.jpg

--------------------------------------------------------------------------------
/input-data/blog.csv:
--------------------------------------------------------------------------------
topic,sub-topic,likes,time,visits
java,maven,1000,1/1/2016,50
unix,unix,100,1/1/2010,100
java,ant,250,1/1/2010,20
java,commons,0,1/1/2008,100

--------------------------------------------------------------------------------
/pom.xml:
--------------------------------------------------------------------------------
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.ts.blog</groupId>
  <artifactId>blog-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <name>Blog::Parent</name>

  <url>http://www.techsquids.com/</url>

  <modules>
    <module>rules-core</module>
    <module>batch</module>
  </modules>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <drools.version>7.7.0.FINAL</drools.version>
  </properties>

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>2.16.0</version>
      </dependency>
      <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.17.1</version>
      </dependency>
      <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.13.1</version>
        <scope>test</scope>
      </dependency>
      <dependency>
        <groupId>org.drools</groupId>
        <artifactId>drools-core</artifactId>
        <version>7.69.0.Final</version>
      </dependency>
      <dependency>
        <groupId>org.drools</groupId>
        <artifactId>drools-compiler</artifactId>
        <version>7.7.0.FINAL</version>
      </dependency>
      <dependency>
        <groupId>org.kie</groupId>
        <artifactId>kie-api</artifactId>
        <version>7.7.0.FINAL</version>
      </dependency>
      <dependency>
        <groupId>com.ts.blog</groupId>
        <artifactId>blog-rules</artifactId>
        <version>${project.version}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.1</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.1</version>
      </dependency>
      <dependency>
        <groupId>com.github.pcj</groupId>
        <artifactId>google-options</artifactId>
        <version>1.0.0</version>
      </dependency>
    </dependencies>
  </dependencyManagement>

  <build>
    <pluginManagement>
      <plugins>
        <plugin>
          <groupId>org.kie</groupId>
          <artifactId>kie-maven-plugin</artifactId>
          <version>7.7.0.Final</version>
          <extensions>true</extensions>
        </plugin>
        <plugin>
          <artifactId>maven-clean-plugin</artifactId>
          <version>3.0.0</version>
        </plugin>
        <plugin>
          <artifactId>maven-resources-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.7.0</version>
        </plugin>
        <plugin>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.20.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <version>2.5.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-deploy-plugin</artifactId>
          <version>2.8.2</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>

--------------------------------------------------------------------------------
/rules-core/kie-rules.iml:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/rules-core/pom.xml:
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <parent>
    <groupId>com.ts.blog</groupId>
    <artifactId>blog-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
    <relativePath>../pom.xml</relativePath>
  </parent>

  <modelVersion>4.0.0</modelVersion>

  <groupId>com.ts.blog</groupId>
  <artifactId>blog-rules</artifactId>

  <name>Blog::Rules</name>
  <description>Module contains various examples of drools rules</description>

  <packaging>kjar</packaging>

  <dependencies>
    <dependency>
      <groupId>org.kie</groupId>
      <artifactId>kie-api</artifactId>
    </dependency>
    <dependency>
      <groupId>org.drools</groupId>
      <artifactId>drools-core</artifactId>
    </dependency>
    <dependency>
      <groupId>org.drools</groupId>
      <artifactId>drools-compiler</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-api</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.kie</groupId>
        <artifactId>kie-maven-plugin</artifactId>
        <extensions>true</extensions>
      </plugin>
    </plugins>
  </build>
</project>

--------------------------------------------------------------------------------
/rules-core/src/main/java/com/ts/blog/rules/KieFunctions.java:
--------------------------------------------------------------------------------
package com.ts.blog.rules;

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Objects;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

/**
 * Helper functions invoked from the DRL rules.
 */
public class KieFunctions {
    private static final Logger logger = LogManager.getLogger(KieFunctions.class);

    /**
     * Returns true when the given date falls before the start of the given year.
     */
    public static boolean dateCompare(Date dt, String year) {
        try {
            Objects.requireNonNull(dt, "dt can't be null");
            Objects.requireNonNull(year, "year can't be null");
            return new SimpleDateFormat("yyyy").parse(year).compareTo(dt) > 0;
        } catch (Exception e) {
            logger.debug("dateCompare failed for dt={} year={}", dt, year, e);
            return false;
        }
    }
}
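A quick worked example of dateCompare: parsing "2010" with the yyyy format yields 2010-01-01, so any earlier date compares as true.

    Date d = new SimpleDateFormat("MM/dd/yyyy").parse("01/01/2008");
    System.out.println(KieFunctions.dateCompare(d, "2010")); // true:  2008 is before 2010-01-01
    System.out.println(KieFunctions.dateCompare(d, "2005")); // false: 2008 is after 2005-01-01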
--------------------------------------------------------------------------------
/rules-core/src/main/java/com/ts/blog/rules/type/Category.java:
--------------------------------------------------------------------------------
package com.ts.blog.rules.type;

/**
 * Blog category
 * @since 1.0
 */
public enum Category {
    NOTABLE, BRONZE, SILVER, GOLD, PLATINUM, POPULAR, PIONEER, CLEANUP
}

--------------------------------------------------------------------------------
/rules-core/src/main/resources/META-INF/kmodule.xml:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/rules-core/src/main/resources/com/ts/blog/kie/BlogCategory.drl:
--------------------------------------------------------------------------------
package com.ts.blog.rules

import org.apache.logging.log4j.Logger
import com.ts.blog.kie.Blog
import com.ts.blog.rules.type.Category
import static com.ts.blog.rules.KieFunctions.*

global Logger LOGGER

rule "cat_top1000"
    salience 1000
    no-loop true
    dialect "java"
    when
        $blog : Blog( dateCompare(time,"2010") == true )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.CLEANUP)};
end


rule "cat_top1"
    salience 100
    no-loop true
    dialect "java"
    when
        $blog : Blog ( likes > 100 && topic.equals("java") )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.BRONZE)};
end

rule "cat_top2"
    salience 101
    no-loop true
    dialect "java"
    when
        $blog : Blog( likes > 200 && topic.equals("java") )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.SILVER)};
end

rule "cat_top3"
    salience 102
    no-loop true
    dialect "java"
    when
        $blog : Blog( likes > 100 && topic.equals("unix") )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.GOLD)};
end

rule "cat_top4"
    salience 103
    no-loop true
    dialect "java"
    when
        $blog : Blog( ( likes > 400 && likes < 600 && subtopic.equals("unix") ) || topic.equals("unix") )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.GOLD)};
end


rule "cat_top6"
    salience 1
    no-loop true
    dialect "java"
    when
        $blog : Blog( likes > 600 && topic.equals("unix") )
    then
        LOGGER.info("cat_top obj:{}",$blog);
        modify( $blog ){setCategory(Category.PLATINUM)};
end

--------------------------------------------------------------------------------
/rules-core/src/main/resources/com/ts/blog/kie/MetaData.drl:
--------------------------------------------------------------------------------
package com.ts.blog.kie

import com.ts.blog.rules.type.Category

/**
 * TechSquids blog model declaration
 * (a @position annotation on each field is optional)
 */

declare Blog

    @author( Rahul )
    @dateOfCreation( 02-June-2018 )

    // @propertyReactive avoids infinite loop execution when any property is updated
    @propertyReactive

    topic : String
    subtopic : String
    likes : Integer
    time : java.util.Date
    visits : Integer
    category : Category
    rating : Integer
end
--------------------------------------------------------------------------------
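To exercise the rules above without Spark, a minimal smoke test can build the KieBase from the classpath exactly as CategoryAssignment does. This is a sketch, assuming rules-core is on the classpath and the snippet lives in a main method that declares throws Exception:

    KieServices ks = KieServices.Factory.get();
    KieBase kieBase = ks.newKieClasspathContainer().newKieBase(ks.newKieBaseConfiguration());
    FactType blogType = kieBase.getFactType("com.ts.blog.kie", "Blog");

    // Build a Blog fact reflectively, just as the Spark function does
    Object blog = blogType.newInstance();
    blogType.set(blog, "topic", "java");
    blogType.set(blog, "likes", 300);

    KieSession kieSession = kieBase.newKieSession();
    kieSession.setGlobal("LOGGER", LogManager.getLogger("smoke-test"));
    kieSession.insert(blog);
    kieSession.fireAllRules(); // cat_top2 (SILVER, salience 101) fires before cat_top1 (BRONZE, salience 100); both match here
    kieSession.dispose();
    System.out.println(blogType.get(blog, "category"));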