Introduction
Spark plugins that train and classify data using Multinomial/Binary Logistic Regression.
Use-case
A manager wants to predict whether a customer will give a tip, based on features of hotel food-order data.
For this purpose, he wants to train a tip classifier on the order-data feed, using the starter and dessert as features and the tip labeled as provided or not.
Label → tip or no tip: 1.0 in case of a tip and 0.0 otherwise.
Feature → {Starter, Dessert}
User Stories
- User should be able to train a model on the data.
- User should be able to classify the test data using the model built while training.
- User should be able to provide the list of columns (features) to use for training.
- User should be able to provide the list of columns (features) to use for classification.
- User should be able to provide the column to be used as the prediction field during training/classification.
- User should be able to provide the number of features to be used during training/classification.
- User should be able to provide the number of classes to be used during training/classification.
- User should be able to provide the FileSet name to save the trained model to.
- User should be able to provide the path of the FileSet.
Example
Suppose the Trainer plugin receives the records below to train the Logistic Regression model:
| Starter | Dessert | Tip |
|---------|---------|-----|
| 1 | 0 | 0.0 |
| 1 | 1 | 1.0 |
| 0 | 1 | 0.0 |
| 0 | 0 | 0.0 |
Trained on the above records, the trainer plugin will save the model in a FileSet, which will later be used by the Logistic Regression classifier to predict the tip value.
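As a rough illustration (not the plugin's actual code), each record above maps to a Spark MLlib LabeledPoint, with Tip as the label and (Starter, Dessert) as the feature vector:

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// Each training record: label = Tip, features = (Starter, Dessert)
val trainingPoints = Seq(
  LabeledPoint(0.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(1.0, Vectors.dense(1.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(0.0, 0.0))
)
```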
Implementation Tips
Design
Logistic Regression Trainer
Input JSON Format
```json
{
  "name": "LogisticRegressionTrainer",
  "type": "sparksink",
  "properties": {
    "fileSetName": "logical-regression-model",
    "path": "/home/cdap/model",
    "featureFields": "Starter,Dessert",
    "labelField": "Tip",
    "numFeatures": "2",
    "numClasses": "2"
  }
}
```
The plugin will take the above inputs from the user and train the model, using the "featureFields" and "labelField" fields as the features and label points respectively.
Properties:
- fileSetName: The name of the FileSet to save the model to.
- path: Path of the FileSet to save the model to.
- featureFields: A comma-separated sequence of field names to be used as features for training.
- labelField: The name of the column in the input structured record containing the data to be treated as the label for prediction.
- numFeatures: The number of features to be used in HashingTF to generate features from string fields.
- numClasses: The number of classes to be used for training the model. It should be of type integer.
The model generated by this plugin will later be used by the Logistic Regression Classifier plugin to classify input data.
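For reference, a minimal sketch of what the training step could look like with Spark MLlib, assuming string-valued feature fields (hashed with HashingTF, sized by "numFeatures"), a local SparkContext, and a plain output path standing in for the CDAP FileSet; the object name TipTrainerSketch and the inline sample data are illustrative only:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.feature.HashingTF
import org.apache.spark.mllib.regression.LabeledPoint

object TipTrainerSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tip-trainer").setMaster("local[*]"))

    // Sample raw records: (Starter, Dessert, Tip), taken from the example table,
    // with the feature fields treated as strings.
    val orders = sc.parallelize(Seq(
      ("1", "0", 0.0),
      ("1", "1", 1.0),
      ("0", "1", 0.0),
      ("0", "0", 0.0)
    ))

    // numFeatures = 2: HashingTF turns the string feature fields into a fixed-size vector
    val hashingTF = new HashingTF(2)
    val trainingData = orders.map { case (starter, dessert, tip) =>
      LabeledPoint(tip, hashingTF.transform(Seq(starter, dessert)))
    }

    // numClasses = 2 (tip / no tip), as in the plugin configuration above
    val model = new LogisticRegressionWithLBFGS()
      .setNumClasses(2)
      .run(trainingData)

    // The plugin would write into the configured FileSet; a plain path stands in here
    model.save(sc, "/home/cdap/model")
    sc.stop()
  }
}
```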
Logistic Regression Classifier
Input JSON Format
```json
{
  "name": "LogisticRegressionClassifier",
  "type": "sparkcompute",
  "properties": {
    "fileSetName": "logical-regression-model",
    "path": "/home/cdap/model",
    "fieldsToClassify": "Starter,Dessert",
    "predictionField": "Tip",
    "numFeatures": "2"
  }
}
```
The classifier plugin will take the above inputs from the user, load the logistic regression model from the FileSet named by "fileSetName", and classify the order data to predict whether the customer will give a tip or not.
Properties:
- fileSetName: The name of the FileSet containing the model.
- path: Path of the FileSet from which the model needs to be retrieved.
- fieldsToClassify: A comma-separated sequence of field names to be used as features for classification.
- predictionField: The name of the column in the input structured record in which the prediction needs to be saved.
- numFeatures: The number of features to be used in HashingTF to generate features from string fields; it should match the value used during training.
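Correspondingly, a minimal classification sketch under the same assumptions as the training sketch above (same HashingTF setup, a plain path in place of the FileSet); TipClassifierSketch and the incoming records are illustrative only:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionModel
import org.apache.spark.mllib.feature.HashingTF

object TipClassifierSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tip-classifier").setMaster("local[*]"))

    // Load the model the trainer saved (located via fileSetName/path in the plugin config)
    val model = LogisticRegressionModel.load(sc, "/home/cdap/model")

    // Incoming records carrying only the fieldsToClassify (Starter, Dessert)
    val newOrders = sc.parallelize(Seq(("1", "1"), ("0", "1")))

    // numFeatures must match the value used at training time
    val hashingTF = new HashingTF(2)

    // The predicted value would be written into the configured predictionField ("Tip")
    val classified = newOrders.map { case (starter, dessert) =>
      (starter, dessert, model.predict(hashingTF.transform(Seq(starter, dessert))))
    }
    classified.collect().foreach(println)
    sc.stop()
  }
}
```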
Checklist
- User stories documented
- User stories reviewed
- Design documented
- Design reviewed
- Feature merged
- Examples and guides
- Integration tests
- Documentation for feature
- Short video demonstrating the feature