020_PowerPlantPipeline_02ModelTuneEvaluate(Scala)

Archived YouTube video of this live unedited lab-lecture:

Power Plant ML Pipeline Application

This is an end-to-end example of using a number of different machine learning algorithms to solve a supervised regression problem.

Table of Contents

  • Step 1: Business Understanding
  • Step 2: Load Your Data
  • Step 3: Explore Your Data
  • Step 4: Visualize Your Data
  • Step 5: Data Preparation
  • Step 6: Data Modeling
  • Step 7: Tuning and Evaluation
  • Step 8: Deployment

We are trying to predict power output given a set of readings from various sensors in a gas-fired power generation plant. Power generation is a complex process, and understanding and predicting power output is an important element in managing a plant and its connection to the power grid.

More information about Peaker or Peaking Power Plants can be found on Wikipedia: https://en.wikipedia.org/wiki/Peaking_power_plant

Given this business problem, we need to translate it to a Machine Learning task. The ML task is regression since the label (or target) we are trying to predict is numeric.

The example data is provided by the UCI Machine Learning Repository: Combined Cycle Power Plant Data Set.

You can read the background on the UCI page, but in summary: we have a number of readings from sensors at a Gas-Fired Power Plant (also called a Peaker Plant), and we want to use those sensor readings to predict how much power the plant will generate.

More information about Machine Learning with Spark can be found in the Spark MLlib Programming Guide.

Please note this example only works with Spark version 1.4 or higher.



To rerun Steps 1-4, done in the notebook 009_PowerPlantPipeline_01ETLEDA, just run the following command as shown in the cell below:

  %run "/scalable-data-science/sds-2-x/009_PowerPlantPipeline_01ETLEDA"
  • Note: If you have already evaluated the %run ... command above, then:

    • first delete the result cell by clicking the x in its top-right corner, and
    • re-evaluate the %run command above.
%run "/scalable-data-science/sds-2-x/009_PowerPlantPipeline_01ETLEDA"
res2: Int = 240
dbfs:/databricks-datasets/power-plant/data/Sheet1.tsv   Sheet1.tsv   308693
dbfs:/databricks-datasets/power-plant/data/Sheet2.tsv   Sheet2.tsv   308693
dbfs:/databricks-datasets/power-plant/data/Sheet3.tsv   Sheet3.tsv   308693
dbfs:/databricks-datasets/power-plant/data/Sheet4.tsv   Sheet4.tsv   308693
dbfs:/databricks-datasets/power-plant/data/Sheet5.tsv   Sheet5.tsv   308693
powerPlantRDD: org.apache.spark.rdd.RDD[String] = /databricks-datasets/power-plant/data/Sheet1.tsv MapPartitionsRDD[10575] at textFile at command-45638284503054:1
AT      V       AP       RH      PE
14.96   41.76   1024.07  73.17   463.26
25.18   62.96   1020.04  59.08   444.37
5.11    39.4    1012.16  92.14   488.56
20.86   57.32   1010.24  76.64   446.48
powerPlantDF: org.apache.spark.sql.DataFrame = [AT: double, V: double ... 3 more fields]
root
 |-- AT: double (nullable = true)
 |-- V: double (nullable = true)
 |-- AP: double (nullable = true)
 |-- RH: double (nullable = true)
 |-- PE: double (nullable = true)
res9: Long = 9568
+-----+-----+-------+-----+------+
|   AT|    V|     AP|   RH|    PE|
+-----+-----+-------+-----+------+
|14.96|41.76|1024.07|73.17|463.26|
|25.18|62.96|1020.04|59.08|444.37|
| 5.11| 39.4|1012.16|92.14|488.56|
|20.86|57.32|1010.24|76.64|446.48|
|10.82| 37.5|1009.23|96.62| 473.9|
|26.27|59.44|1012.23|58.77|443.67|
|15.89|43.96|1014.02|75.24|467.35|
| 9.48|44.71|1019.12|66.43|478.42|
|14.64| 45.0|1021.78|41.25|475.98|
|11.74|43.56|1015.14|70.72| 477.5|
+-----+-----+-------+-----+------+
only showing top 10 rows
[Scatter-plot matrix of AT, V, AP, RH and PE]

Showing sample based on the first 1000 rows.

res12: Long = 9568
+--------+--------------------+-----------+
|database|           tableName|isTemporary|
+--------+--------------------+-----------+
| default|               adult|      false|
| default|    business_csv_csv|      false|
| default|       checkin_table|      false|
| default|            diamonds|      false|
| default|           inventory|      false|
| default|item_merchant_cat...|      false|
| default|      items_left_csv|      false|
| default|     logistic_detail|      false|
| default|    merchant_ratings|      false|
| default|          order_data|      false|
| default|  order_ids_left_csv|      false|
| default|          repeat_csv|      false|
| default|     review_2019_csv|      false|
| default|sample_logistic_t...|      false|
| default|    sentimentlex_csv|      false|
| default|        simple_range|      false|
| default|  social_media_usage|      false|
| default|            tip_json|      false|
| default|        tips_csv_csv|      false|
| default|           users_csv|      false|
+--------+--------------------+-----------+

+------------------------+--------+-----------+---------+-----------+
|name                    |database|description|tableType|isTemporary|
+------------------------+--------+-----------+---------+-----------+
|adult                   |default |null       |EXTERNAL |false      |
|business_csv_csv        |default |null       |EXTERNAL |false      |
|checkin_table           |default |null       |MANAGED  |false      |
|diamonds                |default |null       |EXTERNAL |false      |
|inventory               |default |null       |MANAGED  |false      |
|item_merchant_categories|default |null       |MANAGED  |false      |
|items_left_csv          |default |null       |EXTERNAL |false      |
|logistic_detail         |default |null       |MANAGED  |false      |
|merchant_ratings        |default |null       |MANAGED  |false      |
|order_data              |default |null       |MANAGED  |false      |
|order_ids_left_csv      |default |null       |EXTERNAL |false      |
|repeat_csv              |default |null       |MANAGED  |false      |
|review_2019_csv         |default |null       |EXTERNAL |false      |
|sample_logistic_table   |default |null       |EXTERNAL |false      |
|sentimentlex_csv        |default |null       |EXTERNAL |false      |
|simple_range            |default |null       |MANAGED  |false      |
|social_media_usage      |default |null       |MANAGED  |false      |
|tip_json                |default |null       |EXTERNAL |false      |
|tips_csv_csv            |default |null       |EXTERNAL |false      |
|users_csv               |default |null       |EXTERNAL |false      |
+------------------------+--------+-----------+---------+-----------+

+---------+---------------------+--------------------------------------+
|name     |description          |locationUri                           |
+---------+---------------------+--------------------------------------+
|db_ad_gcs|                     |dbfs:/user/hive/warehouse/db_ad_gcs.db|
|default  |Default Hive database|dbfs:/user/hive/warehouse             |
+---------+---------------------+--------------------------------------+
[Display of power_plant_table (columns AT, V, AP, RH, PE); the first 10 of the displayed rows are reconstructed below, the rest are omitted]

+-----+-----+-------+-----+------+
|   AT|    V|     AP|   RH|    PE|
+-----+-----+-------+-----+------+
|14.96|41.76|1024.07|73.17|463.26|
|25.18|62.96|1020.04|59.08|444.37|
| 5.11| 39.4|1012.16|92.14|488.56|
|20.86|57.32|1010.24|76.64|446.48|
|10.82| 37.5|1009.23|96.62| 473.9|
|26.27|59.44|1012.23|58.77|443.67|
|15.89|43.96|1014.02|75.24|467.35|
| 9.48|44.71|1019.12|66.43|478.42|
|14.64| 45.0|1021.78|41.25|475.98|
|11.74|43.56|1015.14|70.72| 477.5|
+-----+-----+-------+-----+------+

Showing the first 1000 rows.

Columns: AT, V, AP, RH, PE (all double, nullable).

+-------+------------------+------------------+------------------+------------------+------------------+
|summary|                AT|                 V|                AP|                RH|                PE|
+-------+------------------+------------------+------------------+------------------+------------------+
|  count|              9568|              9568|              9568|              9568|              9568|
|   mean| 19.65123118729102| 54.30580372073601|1013.2590781772603| 73.30897784280926| 454.3650094063554|
| stddev|7.4524732296110825|12.707892998326784| 5.938783705811581|14.600268756728964|17.066994999803402|
|    min|              1.81|             25.36|            992.89|             25.56|            420.26|
|    max|             37.11|             81.56|            1033.3|            100.16|            495.76|
+-------+------------------+------------------+------------------+------------------+------------------+
[Scatter plot: Power (PE) vs. Temperature (AT)]

Showing sample based on the first 1000 rows.

[Scatter plot: Power (PE) vs. Exhaust Vacuum (V)]

Showing sample based on the first 1000 rows.

[Scatter plot: Power (PE) vs. Pressure (AP)]

Showing sample based on the first 1000 rows.

[Scatter plot: Power (PE) vs. Humidity (RH)]

Showing sample based on the first 1000 rows.

[Scatter-plot matrix of AT, V, AP, RH and PE]

Showing sample based on the first 1000 rows.



Now we will do the following Steps:

Step 5: Data Preparation,

Step 6: Modeling, and

Step 7: Tuning and Evaluation

We will do Step 8: Deployment later, after we get introduced to Spark Streaming.

Step 5: Data Preparation

The next step is to prepare the data. Since all of this data is numeric and consistent, this is a simple task for us today.

We will need to convert the predictor features from columns to feature vectors using org.apache.spark.ml.feature.VectorAssembler.

The VectorAssembler will be the first step in building our ML pipeline.

//Let's quickly recall the schema and make sure our table is here now
table("power_plant_table").printSchema
root
 |-- AT: double (nullable = true)
 |-- V: double (nullable = true)
 |-- AP: double (nullable = true)
 |-- RH: double (nullable = true)
 |-- PE: double (nullable = true)
powerPlantDF // make sure we have the DataFrame too
res24: org.apache.spark.sql.DataFrame = [AT: double, V: double ... 3 more fields]
import org.apache.spark.ml.feature.VectorAssembler

// make a DataFrame called dataset from the table
val dataset = sqlContext.table("power_plant_table") 

val vectorizer =  new VectorAssembler()
                      .setInputCols(Array("AT", "V", "AP", "RH"))
                      .setOutputCol("features")

import org.apache.spark.ml.feature.VectorAssembler
dataset: org.apache.spark.sql.DataFrame = [AT: double, V: double ... 3 more fields]
vectorizer: org.apache.spark.ml.feature.VectorAssembler = vecAssembler_a6baa233f655

Step 6: Data Modeling

Now let's model our data to predict what the power output will be given a set of sensor readings.

Our first model will be based on simple linear regression since we saw some linear patterns in our data based on the scatter plots during the exploration stage.

Linear Regression Model

  • Linear Regression is one of the most useful workhorses of statistical learning.
  • See Chapter 7 of Kevin Murphy's Machine Learning: A Probabilistic Perspective for a good mathematical and algorithmic introduction.
  • You should have already seen Ameet's treatment of the topic in an earlier notebook.
// First let's hold out 20% of our data for testing and leave 80% for training
var Array(split20, split80) = dataset.randomSplit(Array(0.20, 0.80), 1800009193L)
split20: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [AT: double, V: double ... 3 more fields]
split80: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [AT: double, V: double ... 3 more fields]
// Let's cache these datasets for performance
val testSet = split20.cache()
val trainingSet = split80.cache()
testSet: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [AT: double, V: double ... 3 more fields]
trainingSet: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [AT: double, V: double ... 3 more fields]
testSet.count() // action to actually cache
res29: Long = 1966
trainingSet.count() // action to actually cache
res30: Long = 7602

Let's take a few elements of the three DataFrames.

dataset.take(3)
res31: Array[org.apache.spark.sql.Row] = Array([14.96,41.76,1024.07,73.17,463.26], [25.18,62.96,1020.04,59.08,444.37], [5.11,39.4,1012.16,92.14,488.56])
testSet.take(3)
res32: Array[org.apache.spark.sql.Row] = Array([1.81,39.42,1026.92,76.97,490.55], [3.2,41.31,997.67,98.84,489.86], [3.38,41.31,998.79,97.76,489.11])
trainingSet.take(3)
res33: Array[org.apache.spark.sql.Row] = Array([2.34,39.42,1028.47,69.68,490.34], [2.58,39.42,1028.68,69.03,488.69], [2.64,39.64,1011.02,85.24,481.29])
// ***** LINEAR REGRESSION MODEL ****

import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.ml.regression.LinearRegressionModel
import org.apache.spark.ml.Pipeline

// Let's initialize our linear regression learner
val lr = new LinearRegression()
import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.ml.regression.LinearRegressionModel
import org.apache.spark.ml.Pipeline
lr: org.apache.spark.ml.regression.LinearRegression = linReg_ba1ada380272
// We use explain params to dump the parameters we can use
lr.explainParams()
res34: String =
aggregationDepth: suggested depth for treeAggregate (>= 2) (default: 2)
elasticNetParam: the ElasticNet mixing parameter, in range [0, 1]. For alpha = 0, the penalty is an L2 penalty. For alpha = 1, it is an L1 penalty (default: 0.0)
epsilon: The shape parameter to control the amount of robustness. Must be > 1.0. (default: 1.35)
featuresCol: features column name (default: features)
fitIntercept: whether to fit an intercept term (default: true)
labelCol: label column name (default: label)
loss: The loss function to be optimized. Supported options: squaredError, huber. (Default squaredError) (default: squaredError)
maxIter: maximum number of iterations (>= 0) (default: 100)
predictionCol: prediction column name (default: prediction)
regParam: regularization parameter (>= 0) (default: 0.0)
solver: The solver algorithm for optimization. Supported options: auto, normal, l-bfgs. (Default auto) (default: auto)
standardization: whether to standardize the training features before fitting the model (default: true)
tol: the convergence tolerance for iterative algorithms (>= 0) (default: 1.0E-6)
weightCol: weight column name. If this is not set or empty, we treat all instance weights as 1.0 (undefined)

The cell below is based on the Spark ML pipeline API. More information can be found in the Spark ML Programming Guide at https://spark.apache.org/docs/latest/ml-guide.html

// Now we set the parameters for the method
lr.setPredictionCol("Predicted_PE")
  .setLabelCol("PE")
  .setMaxIter(100)
  .setRegParam(0.1)
// We will use the new spark.ml pipeline API. If you have worked with scikit-learn this will be very familiar.
val lrPipeline = new Pipeline()
lrPipeline.setStages(Array(vectorizer, lr))
// Let's first train on the entire dataset to see what we get
val lrModel = lrPipeline.fit(trainingSet)
lrPipeline: org.apache.spark.ml.Pipeline = pipeline_97dfd6f4e2e5
lrModel: org.apache.spark.ml.PipelineModel = pipeline_97dfd6f4e2e5

Since Linear Regression fits a line (a hyperplane, when there are multiple input dimensions) that minimizes the squared error over the data, the prediction can be expressed as a linear function of the form:

$$y = b_0 + b_1 x_1 + b_2 x_2 + b_3 x_3 + \ldots + b_i x_i + \ldots + b_k x_k$$

where $b_0$ is the intercept and the $b_i$ are the coefficients.

To get the coefficients of that function, we can retrieve the LinearRegressionModel stage from the fitted pipeline model lrModel and read off its weights and intercept.

// The intercept is as follows:
val intercept = lrModel.stages(1).asInstanceOf[LinearRegressionModel].intercept
intercept: Double = 427.9139822165837
// The coefficients (i.e. weights) are as follows:

val weights = lrModel.stages(1).asInstanceOf[LinearRegressionModel].coefficients.toArray
weights: Array[Double] = Array(-1.9083064919040942, -0.25381293007161654, 0.08739350304730673, -0.1474651301033126)

The model has been fit and the intercept and coefficients displayed above.

Now, let us do some work to turn the model into a string that is easy to understand for an applied data scientist or data analyst.

val featuresNoLabel = dataset.columns.filter(col => col != "PE")
featuresNoLabel: Array[String] = Array(AT, V, AP, RH)
val coefficentFeaturePairs = sc.parallelize(weights).zip(sc.parallelize(featuresNoLabel))
coefficentFeaturePairs: org.apache.spark.rdd.RDD[(Double, String)] = ZippedPartitionsRDD2[10674] at zip at command-1805207615647213:1
coefficentFeaturePairs.collect() // this just pairs each coefficient with the name of its corresponding feature
res35: Array[(Double, String)] = Array((-1.9083064919040942,AT), (-0.25381293007161654,V), (0.08739350304730673,AP), (-0.1474651301033126,RH))
// Now let's sort the coefficient-feature pairs by coefficient, in ascending order

var equation = s"y = $intercept "
coefficentFeaturePairs.sortByKey().collect().foreach({
  case (weight, feature) =>
  { 
        val symbol = if (weight > 0) "+" else "-"
        val absWeight = Math.abs(weight)
        equation += (s" $symbol (${absWeight} * ${feature})")
  }
}
)
equation: String = y = 427.9139822165837 - (1.9083064919040942 * AT) - (0.25381293007161654 * V) - (0.1474651301033126 * RH) + (0.08739350304730673 * AP)
// Finally here is our equation
println("Linear Regression Equation: " + equation)
Linear Regression Equation: y = 427.9139822165837 - (1.9083064919040942 * AT) - (0.25381293007161654 * V) - (0.1474651301033126 * RH) + (0.08739350304730673 * AP)
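As a quick sanity check, we can plug the first test-set row (AT=1.81, V=39.42, AP=1026.92, RH=76.97, from testSet.take(3) above) into the fitted equation by hand. This is a minimal sketch, not a cell from the original notebook, reusing the intercept and weights values computed above:

// Evaluate the fitted equation manually for the first test row
val manualPrediction =
  intercept +
  weights(0) * 1.81 +    // AT
  weights(1) * 39.42 +   // V
  weights(2) * 1026.92 + // AP
  weights(3) * 76.97     // RH
println(f"Manual prediction for the first test row: $manualPrediction%.4f")
// prints ~492.8504, matching Predicted_PE for this row in the predictions shown below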

Based on examining the fitted Linear Regression Equation above:

  • There is a strong negative relationship between Atmospheric Temperature (AT) and Power Output: its coefficient, about -1.91, is by far the largest in magnitude.
  • Our other dimensions seem to have comparatively little effect on Power Output.

Do you remember Step 4: Visualize Your Data? When we plotted each predictor against Power Output using a scatter plot, only the temperature variable seemed to have a linear correlation with Power Output, so our final equation seems logical.
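We can back this observation up numerically. The following is a sketch (not a cell from the original notebook) that computes the Pearson correlation of each predictor with PE via DataFrameStatFunctions:

// Pearson correlation of each predictor column with the label PE
Seq("AT", "V", "AP", "RH").foreach { c =>
  println(f"corr($c%s, PE) = ${dataset.stat.corr(c, "PE")}%.3f")
}
// AT is expected to show the strongest (negative) correlation with PE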

Now let's see what our predictions look like given this model.

val predictionsAndLabels = lrModel.transform(testSet)

display(predictionsAndLabels.select("AT", "V", "AP", "RH", "PE", "Predicted_PE"))
[Display of the test-set predictions (AT, V, AP, RH, PE, Predicted_PE); the first 5 of the displayed rows are reconstructed below, the rest are omitted]

+----+-----+-------+-----+------+------------------+
|  AT|    V|     AP|   RH|    PE|      Predicted_PE|
+----+-----+-------+-----+------+------------------+
|1.81|39.42|1026.92|76.97|490.55| 492.8503868481024|
| 3.2|41.31| 997.67|98.84|489.86| 483.9368120270272|
|3.38|41.31| 998.79|97.76|489.11|  483.850459922409|
| 3.4|39.64| 1011.1|83.43|459.86| 487.4251507226833|
|3.51|35.47|1017.53|86.56|489.07|488.37401129434335|
+----+-----+-------+-----+------+------------------+

Showing the first 1000 rows.

Now that we have real predictions we can use an evaluation metric such as Root Mean Squared Error (RMSE) to validate our regression model. The lower the RMSE, the better our model.
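For reference, the metric being computed is

$$\text{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left(\hat{y}_i - y_i\right)^2}$$

where $y_i$ is the observed power output (PE), $\hat{y}_i$ is the predicted one (Predicted_PE), and $n$ is the number of test rows.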

//Now let's compute some evaluation metrics against our test dataset

import org.apache.spark.mllib.evaluation.RegressionMetrics 

val metrics = new RegressionMetrics(predictionsAndLabels.select("Predicted_PE", "PE").rdd.map(r => (r(0).asInstanceOf[Double], r(1).asInstanceOf[Double])))
import org.apache.spark.mllib.evaluation.RegressionMetrics
metrics: org.apache.spark.mllib.evaluation.RegressionMetrics = org.apache.spark.mllib.evaluation.RegressionMetrics@7290c3ef
val rmse = metrics.rootMeanSquaredError
rmse: Double = 4.609375859170583
val explainedVariance = metrics.explainedVariance
explainedVariance: Double = 274.54186073318266
val r2 = metrics.r2
r2: Double = 0.9308377700269259
println (f"Root Mean Squared Error: $rmse")
println (f"Explained Variance: $explainedVariance")  
println (f"R2: $r2")
Root Mean Squared Error: 4.609375859170583
Explained Variance: 274.54186073318266
R2: 0.9308377700269259

Generally, a good model (one whose residuals are roughly normally distributed) will have about 68% of its predictions within 1 RMSE and about 95% within 2 RMSE of the actual value. Let's calculate and see if our model meets this criterion.
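Before we do this with SQL below, here is a direct way to compute these fractions; a minimal sketch (not a cell from the original notebook) reusing predictionsAndLabels and rmse from above:

import org.apache.spark.sql.functions.{abs, col}

val n = predictionsAndLabels.count().toDouble
val within1 = predictionsAndLabels
  .filter(abs(col("PE") - col("Predicted_PE")) <= rmse).count() / n
val within2 = predictionsAndLabels
  .filter(abs(col("PE") - col("Predicted_PE")) <= 2 * rmse).count() / n
println(f"within 1 RMSE: ${within1 * 100}%.1f%%, within 2 RMSE: ${within2 * 100}%.1f%%")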

display(predictionsAndLabels) // recall the DataFrame predictionsAndLabels
[Display of predictionsAndLabels including the assembled features vector column; the first 5 of the displayed rows are reconstructed below, the rest are omitted]

+----+-----+-------+-----+------+--------------------------+------------------+
|  AT|    V|     AP|   RH|    PE|features                  |      Predicted_PE|
+----+-----+-------+-----+------+--------------------------+------------------+
|1.81|39.42|1026.92|76.97|490.55|[1.81,39.42,1026.92,76.97]| 492.8503868481024|
| 3.2|41.31| 997.67|98.84|489.86|[3.2,41.31,997.67,98.84]  | 483.9368120270272|
|3.38|41.31| 998.79|97.76|489.11|[3.38,41.31,998.79,97.76] |  483.850459922409|
| 3.4|39.64| 1011.1|83.43|459.86|[3.4,39.64,1011.1,83.43]  | 487.4251507226833|
|3.51|35.47|1017.53|86.56|489.07|[3.51,35.47,1017.53,86.56]|488.37401129434335|
+----+-----+-------+-----+------+--------------------------+------------------+

Showing the first 1000 rows.

// First we calculate the residual error (PE - Predicted_PE) and divide it by the RMSE, then register the result as a temporary table Power_Plant_RMSE_Evaluation
predictionsAndLabels.selectExpr("PE", "Predicted_PE", "PE - Predicted_PE AS Residual_Error", s""" (PE - Predicted_PE) / $rmse AS Within_RSME""").createOrReplaceTempView("Power_Plant_RMSE_Evaluation")
%sql SELECT * from Power_Plant_RMSE_Evaluation
[Result of the SELECT (PE, Predicted_PE, Residual_Error, Within_RSME); the first 5 of the displayed rows are reconstructed below, the rest are omitted]

+------+------------------+--------------------+--------------------+
|    PE|      Predicted_PE|      Residual_Error|         Within_RSME|
+------+------------------+--------------------+--------------------+
|490.55| 492.8503868481024| -2.3003868481023915|-0.49906688419119855|
|489.86| 483.9368120270272|   5.923187972972812|  1.2850303715606821|
|489.11|  483.850459922409|   5.259540077590998|  1.1410525499080058|
|459.86| 487.4251507226833| -27.565150722683313|  -5.980234974295072|
|489.07|488.37401129434335|  0.6959887056566458| 0.15099413172652035|
+------+------------------+--------------------+--------------------+

Showing the first 1000 rows.

%sql -- Now we can display the normalized residuals as a histogram. Clearly this shows that the error is centered around 0, with the vast majority of it within 2 RMSEs.
SELECT Within_RSME  from Power_Plant_RMSE_Evaluation
[Histogram/density plot of Within_RSME: the distribution is centered around 0, ranging roughly from -6 to 4]

Showing sample based on the first 1000 rows.

We can see this definitively if we count the number of predictions within ±1 RMSE and ±2 RMSE and display the result as a pie chart:

%sql 
SELECT case when Within_RSME <= 1.0 and Within_RSME >= -1.0 then 1  when  Within_RSME <= 2.0 and Within_RSME >= -2.0 then 2 else 3 end RSME_Multiple, COUNT(*) count  from Power_Plant_RMSE_Evaluation
group by case when Within_RSME <= 1.0 and Within_RSME >= -1.0 then 1  when  Within_RSME <= 2.0 and Within_RSME >= -2.0 then 2 else 3 end
[Pie chart of RSME_Multiple counts: 1 ≈ 67%, 2 ≈ 30%, 3 ≈ 3%]

So about 70% of our test-set predictions are within 1 RMSE and about 97% (roughly 67% + 30%) are within 2 RMSE of the actual values. So the model is pretty decent. Let's see if we can tune it to improve it further.

NOTE: these numbers will vary across runs due to the seed in random sampling of training and test set, number of iterations, and other stopping rules in optimization, for example.

Step 7: Tuning and Evaluation

Now that we have a baseline model, let's try to make a better one by tuning over several parameters.

import org.apache.spark.ml.tuning.{ParamGridBuilder, CrossValidator}
import org.apache.spark.ml.evaluation._
import org.apache.spark.ml.tuning.{ParamGridBuilder, CrossValidator}
import org.apache.spark.ml.evaluation._

First let's use a cross validator to split the data into training and validation subsets. See http://spark.apache.org/docs/latest/ml-tuning.html.

//Let's set up our evaluator class to judge the model based on the best root mean squared error
val regEval = new RegressionEvaluator()
regEval.setLabelCol("PE")
  .setPredictionCol("Predicted_PE")
  .setMetricName("rmse")
regEval: org.apache.spark.ml.evaluation.RegressionEvaluator = regEval_7e20af62a956
res44: regEval.type = regEval_7e20af62a956

We now treat the lrPipeline as an Estimator, wrapping it in a CrossValidator instance.

This will allow us to jointly choose parameters for all Pipeline stages.

A CrossValidator requires an Estimator and an Evaluator; we just created the Evaluator (regEval) above, and lrPipeline is our Estimator.

//Let's create our crossvalidator with 3 fold cross validation
val crossval = new CrossValidator()
crossval.setEstimator(lrPipeline)
crossval.setNumFolds(3)
crossval.setEvaluator(regEval)
crossval: org.apache.spark.ml.tuning.CrossValidator = cv_db8f394bea34
res45: crossval.type = cv_db8f394bea34

A CrossValidator also requires a set of EstimatorParamMaps which we set next.

For this we need a regularization parameter (more generally a hyper-parameter that is model-specific).

Now, let's tune over our regularization parameter from 0.01 to 0.10.
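Here is a hedged sketch of one way such a grid could be set up with ParamGridBuilder, assuming the lr, crossval, and trainingSet defined above (the regParam range is the one stated in the previous sentence):

import org.apache.spark.ml.tuning.ParamGridBuilder

// Grid of regularization values 0.01, 0.02, ..., 0.10
val regParamValues = (1 to 10).map(_ / 100.0).toArray
val paramGrid = new ParamGridBuilder()
  .addGrid(lr.regParam, regParamValues)
  .build()
crossval.setEstimatorParamMaps(paramGrid)

// 3 folds x 10 grid points = 30 model fits; CrossValidator keeps the best by RMSE
val cvModel = crossval.fit(trainingSet)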