ScaDaMaLe Course site and book

Utilities Needed for Financial Data

Johannes Graner (LinkedIn), Albert Nilsson (LinkedIn) and Raazesh Sainudiin (LinkedIn)

2020, Uppsala, Sweden

This project was supported by Combient Mix AB through summer internships at:

Combient Competence Centre for Data Engineering Sciences, Department of Mathematics, Uppsala University, Uppsala, Sweden


Here, certain delta.io tables that have already been prepared are loaded.

You will not be able to load them directly, but the libraries used to prepare them are all open-sourced.

object TrendUtils {
  private val fx1mPath = "s3a://XXXXX/findata/com/histdata/free/FX-1-Minute-Data/"
  private val trendCalculusCheckpointPath = "s3a://XXXXX/summerinterns2020/trend-calculus-blog/public/"
  private val streamableTrendCalculusPath = "s3a://XXXXX/summerinterns2020/johannes/streamable-trend-calculus/"
  private val yfinancePath = "s3a://XXXXX/summerinterns2020/yfinance/"
  
  def getFx1mPath = fx1mPath
  def getTrendCalculusCheckpointPath = trendCalculusCheckpointPath
  def getStreamableTrendCalculusPath = streamableTrendCalculusPath
  def getYfinancePath = yfinancePath
}
defined object TrendUtils
class TrendUtils:

  @staticmethod
  def getTrendCalculusCheckpointPath():
    return "s3a://XXXXX/summerinterns2020/trend-calculus-blog/public/"

  @staticmethod
  def getYfinancePath():
    return "s3a://XXXXX/summerinterns2020/yfinance/"

Utilities Needed for Mass Media Data


Here, certain delta.io tables that have already been prepared are loaded.

You will not be able to load them directly, but the libraries used to prepare them are all open-sourced.

object GdeltUtils {
  private val gdeltV1Path = "s3a://XXXXXX/GDELT/delta/bronze/v1/"
  private val eoiCheckpointPath = "s3a://XXXXXX/.../texata/"
  private val poiCheckpointPath = "s3a://XXXXXX/.../person_graph/"

  
  def getGdeltV1Path = gdeltV1Path
  def getEOICheckpointPath = eoiCheckpointPath
  def getPOICheckpointPath = poiCheckpointPath
}
defined object GdeltUtils
class GdeltUtils:

  @staticmethod
  def getEOICheckpointPath():
    return "s3a://XXXXXX/.../texata/"

Trends in Financial Stocks and News Events


According to the Merriam-Webster dictionary, the definition of trend is as follows:

a prevailing tendency or inclination : drift.

Since people invest in financial stocks of publicly traded companies and make these decisions based on their understanding of current events reported in mass media, a natural question is:

How can one try to represent and understand this interplay?

The following material first goes through the ETL process to ingest:

  • financial data and then
  • mass-media data

in a structured manner so that one can begin scalable data science processes upon them.

In the sequel, two libraries are used to take advantage of SparkSQL and delta.io tables ("Spark on ACID"):

  • for encoding and interpreting trends (so-called Trend Calculus) in any time series, say financial stock prices, for instance.
  • for structured representation of the world's largest open-sourced mass media data.

The last few notebooks show some simple data analytics to help extract and identify events that may be related to trends of interest.

We note that the sequel here is mainly focused on the data engineering science of ETL and basic ML Pipelines. We hope it will inspire others to do more sophisticated research, including scalable causal inference and various forms of distributed deep/reinforcement learning for more sophisticated decision problems.

Historical FX-1-M Financial Data


Resources

This notebook builds on the following repository in order to obtain SparkSQL Datasets and DataFrames of freely available FX-1-M data so that they can be ingested into delta.io Tables:

"./000a_finance_utils"

The Trend Calculus library is needed for case classes and parsers for the data.

import org.lamastex.spark.trendcalculus._
import org.lamastex.spark.trendcalculus._
defined object TrendUtils
val filePathRoot = TrendUtils.getFx1mPath

There are many pairs of currencies and/or commodities available.

dbutils.fs.ls(filePathRoot).foreach(fi => println("exchange pair: " + fi.name))
exchange pair: audcad/
exchange pair: audchf/
exchange pair: audjpy/
exchange pair: audnzd/
exchange pair: audusd/
exchange pair: auxaud/
exchange pair: bcousd/
exchange pair: cadchf/
exchange pair: cadjpy/
exchange pair: chfjpy/
exchange pair: etxeur/
exchange pair: euraud/
exchange pair: eurcad/
exchange pair: eurchf/
exchange pair: eurczk/
exchange pair: eurdkk/
exchange pair: eurgbp/
exchange pair: eurhuf/
exchange pair: eurjpy/
exchange pair: eurnok/
exchange pair: eurnzd/
exchange pair: eurpln/
exchange pair: eursek/
exchange pair: eurtry/
exchange pair: eurusd/
exchange pair: frxeur/
exchange pair: gbpaud/
exchange pair: gbpcad/
exchange pair: gbpchf/
exchange pair: gbpjpy/
exchange pair: gbpnzd/
exchange pair: gbpusd/
exchange pair: grxeur/
exchange pair: hkxhkd/
exchange pair: jpxjpy/
exchange pair: nsxusd/
exchange pair: nzdcad/
exchange pair: nzdchf/
exchange pair: nzdjpy/
exchange pair: nzdusd/
exchange pair: sgdjpy/
exchange pair: spxusd/
exchange pair: udxusd/
exchange pair: ukxgbp/
exchange pair: usdcad/
exchange pair: usdchf/
exchange pair: usdczk/
exchange pair: usddkk/
exchange pair: usdhkd/
exchange pair: usdhuf/
exchange pair: usdjpy/
exchange pair: usdmxn/
exchange pair: usdnok/
exchange pair: usdpln/
exchange pair: usdsek/
exchange pair: usdsgd/
exchange pair: usdtry/
exchange pair: usdzar/
exchange pair: wtiusd/
exchange pair: xagusd/
exchange pair: xauaud/
exchange pair: xauchf/
exchange pair: xaueur/
exchange pair: xaugbp/
exchange pair: xauusd/
exchange pair: zarjpy/

Let's look at Brent Oil price in USD.

dbutils.fs.ls(filePathRoot + "bcousd/").foreach(fi => println("name: " + fi.name + ", size: " + fi.size))
name: DAT_ASCII_BCOUSD_M1_2010.csv.gz, size: 284384
name: DAT_ASCII_BCOUSD_M1_2010.txt.gz, size: 41157
name: DAT_ASCII_BCOUSD_M1_2011.csv.gz, size: 2479115
name: DAT_ASCII_BCOUSD_M1_2011.txt.gz, size: 216327
name: DAT_ASCII_BCOUSD_M1_2012.csv.gz, size: 2327511
name: DAT_ASCII_BCOUSD_M1_2012.txt.gz, size: 321867
name: DAT_ASCII_BCOUSD_M1_2013.csv.gz, size: 2109500
name: DAT_ASCII_BCOUSD_M1_2013.txt.gz, size: 417973
name: DAT_ASCII_BCOUSD_M1_2014.csv.gz, size: 1961172
name: DAT_ASCII_BCOUSD_M1_2014.txt.gz, size: 431591
name: DAT_ASCII_BCOUSD_M1_2015.csv.gz, size: 2205678
name: DAT_ASCII_BCOUSD_M1_2015.txt.gz, size: 333277
name: DAT_ASCII_BCOUSD_M1_2016.csv.gz, size: 2131659
name: DAT_ASCII_BCOUSD_M1_2016.txt.gz, size: 342616
name: DAT_ASCII_BCOUSD_M1_2017.csv.gz, size: 1854793
name: DAT_ASCII_BCOUSD_M1_2017.txt.gz, size: 434781
name: DAT_ASCII_BCOUSD_M1_2018.csv.gz, size: 2251250
name: DAT_ASCII_BCOUSD_M1_2018.txt.gz, size: 306810
name: DAT_ASCII_BCOUSD_M1_2019.csv.gz, size: 2701059
name: DAT_ASCII_BCOUSD_M1_2019.txt.gz, size: 102290
name: DAT_ASCII_BCOUSD_M1_202001.csv.gz, size: 233757
name: DAT_ASCII_BCOUSD_M1_202001.txt.gz, size: 8391
name: DAT_ASCII_BCOUSD_M1_202002.csv.gz, size: 222628
name: DAT_ASCII_BCOUSD_M1_202002.txt.gz, size: 4379
name: DAT_ASCII_BCOUSD_M1_202003.csv.gz, size: 265471
name: DAT_ASCII_BCOUSD_M1_202003.txt.gz, size: 834
name: DAT_ASCII_BCOUSD_M1_202004.csv.gz, size: 245819
name: DAT_ASCII_BCOUSD_M1_202004.txt.gz, size: 953
name: DAT_ASCII_BCOUSD_M1_202005.csv.gz, size: 233828
name: DAT_ASCII_BCOUSD_M1_202005.txt.gz, size: 850
name: DAT_ASCII_BCOUSD_M1_202006.csv.gz, size: 244557
name: DAT_ASCII_BCOUSD_M1_202006.txt.gz, size: 1394
name: DAT_ASCII_BCOUSD_M1_202007.csv.gz, size: 74976
name: DAT_ASCII_BCOUSD_M1_202007.txt.gz, size: 1637

We use the parser available from Trend Calculus to read the CSV files into a Spark Dataset.

val oilPath = filePathRoot + "bcousd/*.csv.gz"
val oilDS = spark.read.fx1m(oilPath).orderBy($"time")
oilDS.show(20, false)
+-------------------+-----+-----+-----+-----+------+
|time               |open |high |low  |close|volume|
+-------------------+-----+-----+-----+-----+------+
|2010-11-14 20:15:00|86.73|86.74|86.73|86.74|0     |
|2010-11-14 20:17:00|86.75|86.75|86.75|86.75|0     |
|2010-11-14 20:18:00|86.76|86.78|86.76|86.76|0     |
|2010-11-14 20:19:00|86.74|86.74|86.74|86.74|0     |
|2010-11-14 20:21:00|86.75|86.75|86.74|86.74|0     |
|2010-11-14 20:24:00|86.75|86.75|86.75|86.75|0     |
|2010-11-14 20:26:00|86.76|86.77|86.74|86.77|0     |
|2010-11-14 20:27:00|86.79|86.79|86.75|86.75|0     |
|2010-11-14 20:28:00|86.77|86.79|86.75|86.79|0     |
|2010-11-14 20:32:00|86.8 |86.81|86.79|86.81|0     |
|2010-11-14 20:33:00|86.81|86.81|86.81|86.81|0     |
|2010-11-14 20:34:00|86.81|86.81|86.81|86.81|0     |
|2010-11-14 20:35:00|86.81|86.81|86.79|86.79|0     |
|2010-11-14 20:36:00|86.79|86.8 |86.78|86.8 |0     |
|2010-11-14 20:37:00|86.78|86.8 |86.78|86.79|0     |
|2010-11-14 20:38:00|86.79|86.79|86.79|86.79|0     |
|2010-11-14 20:39:00|86.79|86.8 |86.79|86.79|0     |
|2010-11-14 20:40:00|86.79|86.8 |86.79|86.8 |0     |
|2010-11-14 20:41:00|86.8 |86.8 |86.8 |86.8 |0     |
|2010-11-14 20:42:00|86.8 |86.8 |86.8 |86.8 |0     |
+-------------------+-----+-----+-----+-----+------+
only showing top 20 rows
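The `fx1m` parser handles this conversion for us. As a rough, hypothetical illustration of what such a parser does, here is a minimal pure-Python sketch. It assumes the histdata.com free ASCII M1 format of semicolon-separated `YYYYMMDD HHMMSS;open;high;low;close;volume` rows; the library's actual implementation and the exact file format may differ.

```python
from datetime import datetime

def parse_fx1m_line(line):
    """Parse one assumed histdata.com ASCII M1 bar into a dict.

    Assumed row format: 'YYYYMMDD HHMMSS;open;high;low;close;volume'.
    """
    ts, o, h, l, c, v = line.strip().split(";")
    return {
        "time": datetime.strptime(ts, "%Y%m%d %H%M%S"),
        "open": float(o),
        "high": float(h),
        "low": float(l),
        "close": float(c),
        "volume": int(v),
    }

# A hypothetical row matching the first bar shown above:
row = parse_fx1m_line("20101114 201500;86.73;86.74;86.73;86.74;0")
```

In the notebook, Spark distributes this per-line parsing over all the gzipped CSV files at once, which is why a single `spark.read.fx1m(...)` call suffices.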

yfinance Stock Data


This notebook builds on the following repositories in order to obtain SparkSQL Datasets and DataFrames of freely available Yahoo! Finance data so that they can be ingested into delta.io Tables for trend analysis and more:

Resources:

yfinance is a Python library that makes it easy to download various financial data from Yahoo! Finance.

pip install yfinance
"./000a_finance_utils"

To illustrate the library, we use two share classes (A and C) of the Swedish bank SEB. The (default) data resolution is one day, so we use 20 years of data to get a lot of observations.

import yfinance as yf
dataSEBAST = yf.download("SEB-A.ST", start="2001-07-01", end="2020-07-12")
dataSEBCST = yf.download("SEB-C.ST", start="2001-07-01", end="2020-07-12")
dataSEBAST.size
dataSEBAST

We can also download several tickers at once.

Note that the result is a pandas DataFrame, so we are not doing any distributed computing.

This means that some care has to be taken to not overwhelm the local machine.

defined object TrendUtils
dataSEBAandCSTBST = yf.download("SEB-A.ST SEB-C.ST", start="2020-07-01", end="2020-07-12", group_by="ticker")
dataSEBAandCSTBST
type(dataSEBAST)

Loading the data into a Spark DataFrame.

dataSEBAST_sp = spark.createDataFrame(dataSEBAandCSTBST)
dataSEBAST_sp.printSchema()

The conversion to Spark DataFrame works but is quite messy. Just imagine if there were more tickers!

dataSEBAST_sp.show(20, False)

When selecting a column with a dot in the name (as in ('SEB-A.ST', 'High')) using PySpark, we have to enclose the column name in backticks `.

dataSEBAST_sp.select("`('SEB-A.ST', 'High')`")

We can also get information about individual tickers.

msft = yf.Ticker("MSFT")
print(msft)
msft.info

We write a function to transform data downloaded by yfinance and write out a better-formatted Spark DataFrame.

import pandas as pd
import yfinance as yf
import sys, getopt

# example:
# python3 yfin_to_csv.py -i "60m" "SEB-A.ST INVE-A.ST" "2019-07-01" "2019-07-06" "/root/GIT/yfin_test.csv"
def ingest(interval, tickers, start, end, csv_path):
  df = yf.download(tickers, start=start, end=end, interval=interval, group_by='ticker')
  findf = df.unstack().unstack(1).sort_index(level=1)
  findf.reset_index(level=0, inplace=True)
  findf = findf.loc[start:end]
  findf.rename(columns={'level_0':'Ticker'}, inplace=True)
  findf.index.name='Time'
  findf['Volume'] = pd.to_numeric(findf['Volume'], downcast='integer')
  findf = findf.reset_index(drop=False)
  findf['Time'] = findf['Time'].map(lambda x: str(x))
  spark.createDataFrame(findf).write.mode('overwrite').save(csv_path, format='csv')
  return findf

Let's look at some top-value companies in the world as well as an assortment of Swedish companies.

The number of tickers is now much larger, using the previous method would result in over 100 columns.

This would make it quite difficult to see what's going on, not to mention trying to analyze the data!

The data resolution is now one minute, meaning that 7 days gives a lot of observations.

yfinance only allows downloading 1-minute data one week at a time, and only for dates within the last 30 days.
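Because of this cap, a longer 1-minute history has to be fetched in week-sized chunks and stitched together. The helper below is a hedged sketch of such chunking, not part of yfinance itself; each returned interval could be passed as `start`/`end` to a separate `yf.download(..., interval='1m')` call.

```python
from datetime import date, timedelta

def week_chunks(start, end, max_days=7):
    """Split [start, end) into consecutive intervals of at most max_days days,
    suitable for repeated 1-minute-resolution download calls."""
    chunks = []
    chunk_start = start
    while chunk_start < end:
        chunk_end = min(chunk_start + timedelta(days=max_days), end)
        chunks.append((chunk_start, chunk_end))
        chunk_start = chunk_end
    return chunks

# A 20-day range becomes three chunks of 7, 7, and 6 days:
chunks = week_chunks(date(2021, 1, 1), date(2021, 1, 21))
```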

topValueCompanies = 'MSFT AAPL AMZN GOOG BABA FB BRK-B BRK-A TSLA'
swedishCompanies = 'ASSA-B.ST ATCO-A.ST ALIV-SDB.ST ELUX-A.ST ELUX-B.ST EPI-A.ST EPI-B.ST ERIC-A.ST ERIC-B.ST FORTUM.HE HUSQ-A.ST HUSQ-B.ST INVE-A.ST INVE-B.ST KESKOA.HE KESKOB.HE KNEBV.HE KCR.HE MTRS.ST SAAB-B.ST SAS.ST SEB-A.ST SEB-C.ST SKF-A.ST SKF-B.ST STE-R.ST STERV.HE WRT1V.HE'
tickers = topValueCompanies + ' ' + swedishCompanies
interval = '1m'
start = '2021-01-01'
end = '2021-01-07'
csv_path = TrendUtils.getYfinancePath() + 'stocks_' + interval + '_' + start + '_' + end + '.csv'
dbutils.fs.rm(csv_path, recurse=True)
df = ingest(interval, tickers, start, end, csv_path)

Having written the result to a CSV file, we can use the parser in the Trend Calculus library https://github.com/lamastex/spark-trend-calculus to read the data into a Dataset in Scala Spark.

import org.lamastex.spark.trendcalculus._
import org.lamastex.spark.trendcalculus._
val rootPath = TrendUtils.getYfinancePath
val csv_path = rootPath + "stocks_1m_2021-01-01_2021-01-07.csv"
val yfinDF = spark.read.yfin(csv_path)
yfinDF.count
res1: Long = 71965
yfinDF.show(20, false)
+-------------------+---------+------------------+------------------+------------------+------------------+------------------+------+
|time               |ticker   |open              |high              |low               |close             |adjClose          |volume|
+-------------------+---------+------------------+------------------+------------------+------------------+------------------+------+
|2021-01-04 14:45:00|BRK-B    |230.7899932861328 |230.99000549316406|230.72999572753906|230.97000122070312|230.97000122070312|13367 |
|2021-01-04 14:45:00|ELUX-A.ST|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|ELUX-B.ST|190.14999389648438|190.14999389648438|190.14999389648438|190.14999389648438|190.14999389648438|2770  |
|2021-01-04 14:45:00|EPI-A.ST |null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|EPI-B.ST |null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|ERIC-A.ST|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|ERIC-B.ST|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|FB       |271.30499267578125|271.42498779296875|271.30499267578125|271.42498779296875|271.42498779296875|28330 |
|2021-01-04 14:45:00|FORTUM.HE|20.31999969482422 |20.31999969482422 |20.31999969482422 |20.31999969482422 |20.31999969482422 |49824 |
|2021-01-04 14:45:00|GOOG     |1749.4150390625   |1751.260009765625 |1749.4150390625   |1750.18994140625  |1750.18994140625  |4109  |
|2021-01-04 14:45:00|HUSQ-A.ST|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|HUSQ-B.ST|108.9000015258789 |108.9000015258789 |108.9000015258789 |108.9000015258789 |108.9000015258789 |7170  |
|2021-01-04 14:45:00|INVE-A.ST|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|INVE-B.ST|612.5999755859375 |612.5999755859375 |612.4000244140625 |612.4000244140625 |612.4000244140625 |1628  |
|2021-01-04 14:45:00|KCR.HE   |29.040000915527344|29.040000915527344|29.040000915527344|29.040000915527344|29.040000915527344|0     |
|2021-01-04 14:45:00|KESKOA.HE|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|KESKOB.HE|null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|KNEBV.HE |null              |null              |null              |null              |null              |null  |
|2021-01-04 14:45:00|MSFT     |220.25            |220.32000732421875|220.1699981689453 |220.1699981689453 |220.1699981689453 |96255 |
|2021-01-04 14:45:00|MTRS.ST  |null              |null              |null              |null              |null              |null  |
+-------------------+---------+------------------+------------------+------------------+------------------+------------------+------+
only showing top 20 rows

Finding Trends in Oil Price Data


Resources

This builds on the following library and its antecedents therein to find trends in historical oil prices:

This work was inspired by:

"./000a_finance_utils"

When dealing with time series, it can be difficult to find a good way to identify and analyze trends in the data.

One approach is by using the Trend Calculus algorithm invented by Andrew Morgan. More information about Trend Calculus can be found at https://lamastex.github.io/spark-trend-calculus-examples/.

defined object TrendUtils
import org.lamastex.spark.trendcalculus._
import spark.implicits._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import java.sql.Timestamp
import org.lamastex.spark.trendcalculus._
import spark.implicits._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import java.sql.Timestamp

The input to the algorithm is data in the format (ticker, time, value). In this example, ticker is "BCOUSD" (Brent Crude Oil), time is given in minutes and value is the closing price for Brent Crude Oil during that minute.

This data is historical data from 2010 to 2019 taken from https://www.histdata.com/ using methods from FX-1-Minute-Data by Philippe Remy. In this notebook, everything is done on static dataframes. We will soon see examples on streaming dataframes.

There are gaps in the data, notably during the weekends when no trading takes place, but this does not affect the algorithm, as it does not make any assumptions about the data other than that time is monotonically increasing.

The window size is set to 2, which is minimal, because we want to retain as much information as possible.

val windowSize = 2
val dataRootPath = TrendUtils.getFx1mPath
val oilDS = spark.read.fx1m(dataRootPath + "bcousd/*.csv.gz").toDF.withColumn("ticker", lit("BCOUSD")).select($"ticker", $"time" as "x", $"close" as "y").as[TickerPoint].orderBy("x")

If we want to look at long term trends, we can use the output time series as input for another iteration. The output contains the points of the input where the trend changes (reversals). This can be repeated several times, resulting in longer term trends.

Here, we look at (up to) 15 iterations of the algorithm. It is no problem if the output of some iteration is too small to find a reversal in the next iteration, since the output will just be an empty dataframe in that case.
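The core iteration idea can be illustrated outside Spark with a toy reversal finder on a plain Python list. This is only a simplified stand-in, keeping points where the direction of change flips, and is not the library's actual OHLC-windowed Trend Calculus algorithm:

```python
def reversals(series):
    """Toy stand-in for Trend Calculus reversal detection:
    return (index, value) of points where the direction of change flips."""
    out = []
    for i in range(1, len(series) - 1):
        left = series[i] - series[i - 1]
        right = series[i + 1] - series[i]
        if left * right < 0:  # up-then-down or down-then-up at i
            out.append((i, series[i]))
    return out

# toy zigzag "price" series
prices = [1, 3, 2, 4, 1, 5, 0, 6, 2, 7, 3]
first_order = reversals(prices)

# Feeding the reversal values back in yields fewer, longer-term reversals,
# mirroring the decreasing reversal counts seen later in this notebook.
second_order = reversals([v for _, v in first_order])
```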

val numReversals = 15
val dfWithReversals = new TrendCalculus2(oilDS, windowSize, spark).nReversalsJoinedWithMaxRev(numReversals)
numReversals: Int = 15
dfWithReversals: org.apache.spark.sql.DataFrame = [ticker: string, x: timestamp ... 17 more fields]
dfWithReversals.show(20, false)
+------+-------------------+-----+---------+---------+---------+---------+---------+---------+---------+---------+---------+----------+----------+----------+----------+----------+----------+------+
|ticker|x                  |y    |reversal1|reversal2|reversal3|reversal4|reversal5|reversal6|reversal7|reversal8|reversal9|reversal10|reversal11|reversal12|reversal13|reversal14|reversal15|maxRev|
+------+-------------------+-----+---------+---------+---------+---------+---------+---------+---------+---------+---------+----------+----------+----------+----------+----------+----------+------+
|BCOUSD|2010-11-14 20:15:00|86.74|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:17:00|86.75|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:18:00|86.76|-1       |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |1     |
|BCOUSD|2010-11-14 20:19:00|86.74|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:21:00|86.74|1        |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |1     |
|BCOUSD|2010-11-14 20:24:00|86.75|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:26:00|86.77|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:27:00|86.75|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:28:00|86.79|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:32:00|86.81|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:33:00|86.81|-1       |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |1     |
|BCOUSD|2010-11-14 20:34:00|86.81|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:35:00|86.79|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:36:00|86.8 |null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:37:00|86.79|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:38:00|86.79|1        |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |1     |
|BCOUSD|2010-11-14 20:39:00|86.79|null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:40:00|86.8 |null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:41:00|86.8 |null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
|BCOUSD|2010-11-14 20:42:00|86.8 |null     |null     |null     |null     |null     |null     |null     |null     |null     |null      |null      |null      |null      |null      |null      |0     |
+------+-------------------+-----+---------+---------+---------+---------+---------+---------+---------+---------+---------+----------+----------+----------+----------+----------+----------+------+
only showing top 20 rows

The number of reversals decreases rapidly as more iterations are done.

dfWithReversals.cache.count
res3: Long = 2859310
(1 to numReversals).foreach( i => println(dfWithReversals.filter(s"reversal$i is not null").count))
775283
253258
93804
37068
15397
6595
2858
1240
530
230
96
45
25
11
6

We write the resulting dataframe to parquet in order to produce visualizations using Python.

val checkPointPath = TrendUtils.getTrendCalculusCheckpointPath
dfWithReversals.write.mode(SaveMode.Overwrite).parquet(checkPointPath + "joinedDSWithMaxRev_new")
dfWithReversals.unpersist

Visualization

The Python library plotly is used to make interactive visualizations.

from plotly.offline import plot
from plotly.graph_objs import *
from datetime import *
checkPointPath = TrendUtils.getTrendCalculusCheckpointPath()
joinedDS = spark.read.parquet(checkPointPath + "joinedDSWithMaxRev_new").orderBy("x")

We check the size of the dataframe to see if it is possible to handle locally, since plotly is not available for distributed data.

joinedDS.count()

Almost 3 million rows might be too much for the driver! The time series has to be thinned out in order to display locally.

No information about higher-order trend reversals is lost, since every higher-order reversal is also a lower-order reversal. The lowest orders of reversal are on the scale of minutes (maybe hours), which is probably not very interesting considering that the data stretches over roughly 10 years!

joinedDS.filter("maxRev > 2").count()

Just shy of 100k rows is no problem for the driver.

We select the relevant information in the dataframe for visualization.

fullTS = joinedDS.filter("maxRev > 2").select("x","y","maxRev").collect()

Picking an interval to focus on.

Start and end dates as (year, month, day, hour, minute, second). Only year, month and day are required. The interval from years 1800 to 2200 ensures all data is selected.

startDate = datetime(1800,1,1)
endDate = datetime(2200,12,31)
TS = [row for row in fullTS if startDate <= row['x'] and row['x'] <= endDate]

Setting up the visualization.

numReversals = 15
startReversal = 7

allData = {'x': [row['x'] for row in TS], 'y': [row['y'] for row in TS], 'maxRev': [row['maxRev'] for row in TS]}
revTS = [row for row in TS if row[2] >= startReversal]
colorList = ['rgba(' + str(tmp) + ',' + str(255-tmp) + ',' + str(255-tmp) + ',1)' for tmp in [int(i*255/(numReversals-startReversal+1)) for i in range(1,numReversals-startReversal+2)]]

def getRevTS(tsWithRevMax, revMax):
  x = [row[0] for row in tsWithRevMax if row[2] >= revMax]
  y = [row[1] for row in tsWithRevMax if row[2] >= revMax]
  return x,y,revMax

reducedData = [getRevTS(revTS, i) for i in range(startReversal, numReversals+1)]

markerPlots = [Scattergl(x=x, y=y, mode='markers', marker=dict(color=colorList[i-startReversal], size=i), name='Reversal ' + str(i)) for (x,y,i) in reducedData]

Plotting the result as a plotly graph

The graph is interactive: one can drag to zoom in on an area (double-click to zoom back out) and click on the legend to hide or show different series.

Note that we have left out many of the lower-order reversals in order to not make the graph too cluttered. The seventh-order reversal (the lowest order shown) is still on the scale of hours to a few days.

p = plot(
  [Scattergl(x=allData['x'], y=allData['y'], mode='lines', name='Oil Price')] + markerPlots
  ,
  output_type='div'
)

displayHTML(p)

[Figure: interactive plotly graph of the oil price with higher-order Trend Calculus reversals overlaid]

Streaming Trend Calculus with Maximum Necessary Reversals


Resources

This builds on the following library and its antecedents therein:

This work was inspired by:

"./000a_finance_utils"

We use the spark-trend-calculus library and Spark structured streams over delta.io files to obtain a representation of the complete time series of trends with their k-th order reversal.

This representation is a sufficient statistic for a Markov model of trends that we show in the next notebook.
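As a minimal, hypothetical illustration of what fitting such a Markov model involves, here is a pure-Python sketch that estimates transition frequencies from a toy sequence of up/down trend states. The actual model in the next notebook is built on the Spark representation of reversals, not on Python lists:

```python
from collections import Counter

def transition_probs(states):
    """Estimate empirical Markov transition probabilities from a state sequence."""
    counts = Counter(zip(states, states[1:]))  # count (from_state, to_state) pairs
    totals = Counter()
    for (a, _), n in counts.items():
        totals[a] += n
    return {(a, b): n / totals[a] for (a, b), n in counts.items()}

# toy sequence of up (+1) / down (-1) trend states
states = [1, 1, -1, 1, -1, -1, 1, 1]
probs = transition_probs(states)
```

From such estimated probabilities one can ask, for instance, how likely an up-trend is to persist versus reverse.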

defined object TrendUtils
import java.sql.Timestamp
import io.delta.tables._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, OutputMode, Trigger}
import org.apache.spark.sql.types._
import org.apache.spark.sql.expressions.{Window, WindowSpec}
import org.lamastex.spark.trendcalculus._
import java.sql.Timestamp
import io.delta.tables._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, OutputMode, Trigger}
import org.apache.spark.sql.types._
import org.apache.spark.sql.expressions.{Window, WindowSpec}
import org.lamastex.spark.trendcalculus._

The input data is in S3. It contains oil price data from 2010 to last month and gold price data from 2009 to last month.

val rootPath = "s3a://XXXXX/summerinterns2020/johannes/streamable-trend-calculus/"
val oilGoldPath = rootPath + "oilGoldDelta"
spark.read.format("delta").load(oilGoldPath).orderBy("x").show(20,false)
+------+-------------------+------+
|ticker|x                  |y     |
+------+-------------------+------+
|XAUUSD|2009-03-15 17:00:00|929.6 |
|XAUUSD|2009-03-15 18:00:00|926.05|
|XAUUSD|2009-03-15 18:01:00|925.9 |
|XAUUSD|2009-03-15 18:02:00|925.9 |
|XAUUSD|2009-03-15 18:03:00|926.95|
|XAUUSD|2009-03-15 18:04:00|925.8 |
|XAUUSD|2009-03-15 18:05:00|926.35|
|XAUUSD|2009-03-15 18:06:00|925.8 |
|XAUUSD|2009-03-15 18:07:00|925.6 |
|XAUUSD|2009-03-15 18:08:00|925.7 |
|XAUUSD|2009-03-15 18:09:00|925.4 |
|XAUUSD|2009-03-15 18:10:00|925.75|
|XAUUSD|2009-03-15 18:11:00|925.7 |
|XAUUSD|2009-03-15 18:12:00|925.65|
|XAUUSD|2009-03-15 18:13:00|925.65|
|XAUUSD|2009-03-15 18:14:00|925.75|
|XAUUSD|2009-03-15 18:15:00|925.75|
|XAUUSD|2009-03-15 18:16:00|925.65|
|XAUUSD|2009-03-15 18:17:00|925.6 |
|XAUUSD|2009-03-15 18:18:00|925.85|
+------+-------------------+------+
only showing top 20 rows

Reading the data from s3 as a Structured Stream to simulate streaming.

val input = spark
  .readStream
  .format("delta")
  .load(oilGoldPath)
  .as[TickerPoint]
input: org.apache.spark.sql.Dataset[org.lamastex.spark.trendcalculus.TickerPoint] = [ticker: string, x: timestamp ... 1 more field]

Using the spark-trend-calculus library to:

  1. Apply Trend Calculus to the streaming dataset.
  2. Save the result as a delta table.
  3. Read the result back as a stream.
  4. Repeat from step 1 using the latest result as input, stopping when the result is empty.

val windowSize = 2

// Initializing variables for while loop.
var i = 1
var prevSinkPath = ""
var sinkPath = rootPath + "multiSinks/reversal" + (i)
var chkptPath = rootPath + "multiSinks/checkpoint/" + (i)

// The first order reversal.
var stream = new TrendCalculus2(input, windowSize, spark)
  .reversals
  .select("tickerPoint.ticker", "tickerPoint.x", "tickerPoint.y", "reversal")
  .as[FlatReversal]
  .writeStream
  .format("delta")
  .option("path", sinkPath)
  .option("checkpointLocation", chkptPath)
  .trigger(Trigger.Once())
  .start

stream.processAllAvailable

i += 1

var lastReversalSeries = spark.emptyDataset[TickerPoint]
while (!spark.read.format("delta").load(sinkPath).isEmpty) {
  
  prevSinkPath = rootPath + "multiSinks/reversal" + (i-1)
  sinkPath = rootPath + "multiSinks/reversal" + (i)
  chkptPath = rootPath + "multiSinks/checkpoint/" + (i)
  
  // Reading last result as stream
  lastReversalSeries = spark
    .readStream
    .format("delta")
    .load(prevSinkPath)
    .drop("reversal")
    .as[TickerPoint]

  // Writing next result
  stream = new TrendCalculus2(lastReversalSeries, windowSize, spark)
    .reversals
    .select("tickerPoint.ticker", "tickerPoint.x", "tickerPoint.y", "reversal")
    .as[FlatReversal]
    .map( rev => rev.copy(reversal=i*rev.reversal))
    .writeStream
    .format("delta")
    .option("path", sinkPath)
    .option("checkpointLocation", chkptPath)
    .partitionBy("ticker")
    .trigger(Trigger.Once())
    .start
  
  stream.processAllAvailable()
  
  i += 1
}

Checking the total number of reversals written. The last sink is empty, so the highest order of reversal is the number of sinks minus one.

val i = dbutils.fs.ls(rootPath + "multiSinks").length - 1
i: Int = 18

The written delta tables could be read as streams, but for now we read them as static datasets so that they can be joined together.

val sinkPaths = (1 to i-1).map(rootPath + "multiSinks/reversal" + _)
val maxRevPath = rootPath + "maxRev"
val revTables = sinkPaths.map(DeltaTable.forPath(_).toDF.as[FlatReversal])
val oilGoldTable = DeltaTable.forPath(oilGoldPath).toDF.as[TickerPoint]

The number of reversals decreases rapidly as the reversal order increases.

revTables.map(_.cache.count)
res6: scala.collection.immutable.IndexedSeq[Long] = Vector(1954849, 677939, 262799, 108202, 46992, 21154, 9703, 4427, 1992, 890, 404, 183, 83, 35, 15, 7, 2)
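The decay is roughly geometric. A quick plain-Scala check (no Spark needed; the numbers are copied from the output above) shows that each order keeps at most half of the previous order's reversal points:

```scala
// Reversal counts per order, copied from the revTables.map(_.cache.count) output above.
val counts = Vector(1954849L, 677939L, 262799L, 108202L, 46992L, 21154L,
  9703L, 4427L, 1992L, 890L, 404L, 183L, 83L, 35L, 15L, 7L, 2L)

// Ratio between consecutive orders: each order of reversal thins the
// series by a roughly constant factor.
val ratios = counts.zip(counts.tail).map { case (a, b) => a.toDouble / b }

println(f"decay factors range from ${ratios.min}%.2f to ${ratios.max}%.2f")
```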

Joining all results to get a dataset with all reversals in a single column.

def maxByAbs(a: Int, b: Int): Int = {
  Seq(a,b).maxBy(math.abs)
}

val maxByAbsUDF = udf((a: Int, b: Int) => maxByAbs(a,b))

val maxRevDS = revTables.foldLeft(oilGoldTable.toDF.withColumn("reversal", lit(0)).as[FlatReversal]){ (acc: Dataset[FlatReversal], ds: Dataset[FlatReversal]) => 
  acc
    .toDF
    .withColumnRenamed("reversal", "oldMaxRev")
    .join(ds.select($"ticker" as "tmpt", $"x" as "tmpx", $"reversal" as "newRev"), $"ticker" === $"tmpt" && $"x" === $"tmpx", "left")
    .drop("tmpt", "tmpx")
    .na.fill(0,Seq("newRev"))
    .withColumn("reversal", maxByAbsUDF($"oldMaxRev", $"newRev"))
    .select("ticker", "x", "y", "reversal")
    .as[FlatReversal]    
}
maxByAbs: (a: Int, b: Int)Int
maxByAbsUDF: org.apache.spark.sql.expressions.UserDefinedFunction = SparkUserDefinedFunction($Lambda$9089/155868360@d163ce5,IntegerType,List(Some(class[value[0]: int]), Some(class[value[0]: int])),None,false,true)
maxRevDS: org.apache.spark.sql.Dataset[org.lamastex.spark.trendcalculus.FlatReversal] = [ticker: string, x: timestamp ... 2 more fields]
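The fold above merges one reversal column at a time with maxByAbs, which keeps whichever value has the largest magnitude while preserving its sign, so a higher-order reversal always wins over a lower-order one or over 0. A minimal illustration of that helper on its own:

```scala
// Same logic as the maxByAbs used in the fold above: keep the value
// with the largest absolute value, preserving its sign.
def maxByAbs(a: Int, b: Int): Int = Seq(a, b).maxBy(math.abs)

// A fourth-order downward reversal (-4) dominates a first-order upward one (1).
println(maxByAbs(1, -4))   // -4
// A point with no reversal (0) never overrides a reversal.
println(maxByAbs(-2, 0))   // -2
```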

Writing result as delta table.

maxRevDS.write.format("delta").partitionBy("ticker").save(maxRevPath)

The reversal column in the joined dataset contains the information of all orders of reversals.

A value of 0 indicates that no reversal happens at the point, while a non-zero value indicates that the point is a reversal for that order and every lower order.

For example, row 33 contains the value -4, meaning that this point is a trend reversal downwards for orders 1, 2, 3, and 4.
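This encoding can be unpacked mechanically; the helpers below are hypothetical (not part of the library) and only spell out the convention:

```scala
// Decode the reversal encoding: the sign gives the direction of the trend
// reversal and the magnitude k marks a reversal for every order 1 to k.
def reversalOrders(reversal: Int): Seq[Int] = 1 to math.abs(reversal)

def direction(reversal: Int): String =
  if (reversal < 0) "down" else if (reversal > 0) "up" else "none"

// The value -4: a downward reversal for orders 1 through 4.
println(s"${direction(-4)} at orders ${reversalOrders(-4).mkString(", ")}")
```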

DeltaTable.forPath(maxRevPath).toDF.as[FlatReversal].filter("ticker == 'BCOUSD'").orderBy("x").show(35, false)
+------+-------------------+-----+--------+
|ticker|x                  |y    |reversal|
+------+-------------------+-----+--------+
|BCOUSD|2010-11-14 20:15:00|86.74|0       |
|BCOUSD|2010-11-14 20:17:00|86.75|0       |
|BCOUSD|2010-11-14 20:18:00|86.76|-1      |
|BCOUSD|2010-11-14 20:19:00|86.74|0       |
|BCOUSD|2010-11-14 20:21:00|86.74|1       |
|BCOUSD|2010-11-14 20:24:00|86.75|0       |
|BCOUSD|2010-11-14 20:26:00|86.77|0       |
|BCOUSD|2010-11-14 20:27:00|86.75|0       |
|BCOUSD|2010-11-14 20:28:00|86.79|0       |
|BCOUSD|2010-11-14 20:32:00|86.81|0       |
|BCOUSD|2010-11-14 20:33:00|86.81|-1      |
|BCOUSD|2010-11-14 20:34:00|86.81|0       |
|BCOUSD|2010-11-14 20:35:00|86.79|0       |
|BCOUSD|2010-11-14 20:36:00|86.8 |0       |
|BCOUSD|2010-11-14 20:37:00|86.79|0       |
|BCOUSD|2010-11-14 20:38:00|86.79|1       |
|BCOUSD|2010-11-14 20:39:00|86.79|0       |
|BCOUSD|2010-11-14 20:40:00|86.8 |0       |
|BCOUSD|2010-11-14 20:41:00|86.8 |0       |
|BCOUSD|2010-11-14 20:42:00|86.8 |0       |
|BCOUSD|2010-11-14 20:43:00|86.82|0       |
|BCOUSD|2010-11-14 20:44:00|86.81|0       |
|BCOUSD|2010-11-14 20:47:00|86.84|0       |
|BCOUSD|2010-11-14 20:48:00|86.82|0       |
|BCOUSD|2010-11-14 20:49:00|86.84|-1      |
|BCOUSD|2010-11-14 20:50:00|86.82|0       |
|BCOUSD|2010-11-14 20:51:00|86.83|0       |
|BCOUSD|2010-11-14 20:52:00|86.82|1       |
|BCOUSD|2010-11-14 20:53:00|86.83|0       |
|BCOUSD|2010-11-14 20:54:00|86.84|0       |
|BCOUSD|2010-11-14 20:58:00|86.88|0       |
|BCOUSD|2010-11-14 20:59:00|86.89|0       |
|BCOUSD|2010-11-14 21:00:00|86.9 |-4      |
|BCOUSD|2010-11-14 21:03:00|86.87|0       |
|BCOUSD|2010-11-14 21:04:00|86.87|0       |
+------+-------------------+-----+--------+
only showing top 35 rows

ScaDaMaLe Course site and book

Detecting Events of Interest to OIL/GAS Price Trends

Johannes Graner (LinkedIn), Albert Nilsson (LinkedIn) and Raazesh Sainudiin (LinkedIn)

2020, Uppsala, Sweden

This project was supported by Combient Mix AB through summer internships at:

Combient Competence Centre for Data Engineering Sciences, Department of Mathematics, Uppsala University, Uppsala, Sweden


Here we will build a pipeline to investigate possible events related to oil and gas that are reported in mass media around the world and their possible co-occurrence with certain trends and trend-reversals in oil price.

Steps:

  • Step 0. Setting up and loading GDELT delta.io tables
  • Step 1. Extracting coverage around gas and oil from each country
  • Step 2. Extracting the news around dates with high coverage (big events)
  • Step 3. Enriching the oil price with trend calculus and comparing it to the coverage

Resources:

This builds on the following libraries and their antecedents therein:

This work was inspired by:

import spark.implicits._
import io.delta.tables._
import com.aamend.spark.gdelt._
import org.apache.spark.sql.{Dataset,DataFrame,SaveMode}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions._
import org.lamastex.spark.trendcalculus._
import java.sql.{Date,Timestamp}
import java.text.SimpleDateFormat

Loading in the data from our live-updating delta.io tables

"./000a_finance_utils"
defined object TrendUtils
"./000b_gdelt_utils"
defined object GdeltUtils
val rootPath = GdeltUtils.getGdeltV1Path
val rootCheckpointPath = GdeltUtils.getEOICheckpointPath
val gkgPath = rootPath+"gkg"
val eventPath = rootPath+"events"
val gkg_v1 = spark.read.format("delta").load(gkgPath).as[GKGEventV1]
val eve_v1 = spark.read.format("delta").load(eventPath).as[EventV1]

Step 1. Extracting coverage around gas and oil from each country

We limit the data to the period covered by the oil price series and extract the events related to the oil and gas themes.

val gkg_v1_filt = gkg_v1.filter($"publishDate" > "2013-04-01 00:00:00" && $"publishDate" < "2019-12-31 00:00:00")
val oil_gas_themeGKG = gkg_v1_filt.filter(c => c.themes.contains("ENV_GAS") || c.themes.contains("ENV_OIL"))
                              .select(explode($"eventIds"))
                              .toDF("eventId")
                              .groupBy($"eventId")
                              .agg(count($"eventId"))
                              .toDF("eventId", "count")
val oil_gas_eventDF = eve_v1.toDF()
                            .join(oil_gas_themeGKG, "eventId")

Checkpoint

oil_gas_eventDF.write.parquet(rootCheckpointPath + "oil_gas_event_v1")
val oil_gas_eventDF = spark.read.parquet(rootCheckpointPath + "oil_gas_event_v1")

Extracting coverage for each country

// Counting the number of articles for each country and day, and applying a moving average to each country's coverage.
def movingAverage(df: DataFrame, size: Int, avgOn: String): DataFrame = {
  val windowSpec = Window.partitionBy($"country").orderBy($"date").rowsBetween(-size/2, size/2)
  df.withColumn("coverage", avg(avgOn).over(windowSpec))
}

val oilEventTemp = oil_gas_eventDF
  .filter(length(col("eventGeo.countryCode")) > 0)
  .groupBy(
    col("eventGeo.countryCode").as("country"),
    col("eventDay").as("date")
  )
  .agg(
    sum(col("numArticles")).as("articles")
  )

val (mean_articles, std_articles) = oilEventTemp.select(mean("articles"), stddev("articles"))
  .as[(Double, Double)]
  .first()

// Applying a 7-day moving average to the normalized number of articles.
val oilEventWeeklyCoverage = movingAverage(
    oilEventTemp.withColumn("normArticles", ($"articles" - mean_articles) / std_articles),
    7,
    "normArticles")
  .drop("normArticles")
movingAverage: (df: org.apache.spark.sql.DataFrame, size: Int, avgOn: String)org.apache.spark.sql.DataFrame
oilEventTemp: org.apache.spark.sql.DataFrame = [country: string, date: date ... 1 more field]
mean_articles: Double = 1593.3213115310818
std_articles: Double = 10276.10388376719
oilEventWeeklyCoverage: org.apache.spark.sql.DataFrame = [country: string, date: date ... 2 more fields]
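The window spec rowsBetween(-size/2, size/2) defines a centered row frame: with size = 7, each day is averaged with the 3 days before and the 3 days after it, and the frame is truncated at the partition edges. The same semantics on a plain sequence, as a minimal sketch without Spark:

```scala
// Centered moving average mirroring rowsBetween(-size/2, size/2):
// the window shrinks near the ends, as Spark's row frame does.
def movingAverage(xs: Seq[Double], size: Int): Seq[Double] =
  xs.indices.map { i =>
    val window = xs.slice(math.max(0, i - size / 2), math.min(xs.length, i + size / 2 + 1))
    window.sum / window.length
  }

val smoothed = movingAverage(Seq(1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0), 7)
// The middle point averages all 7 values; the first point only the first 4.
```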

Checkpoint

oilEventWeeklyCoverage.write.parquet(rootCheckpointPath + "oil_gas_cov_norm")
val oil_gas_cov_norm = spark.read.parquet(rootCheckpointPath + "oil_gas_cov_norm")

Enriching the event data with the extracted coverage

val oilEventWeeklyCoverageC = oil_gas_cov_norm.drop($"articles").toDF("country", "tempDate", "coverage")
val oilEventCoverageDF = oilEventWeeklyCoverageC.join(
  oil_gas_eventDF,
  oil_gas_eventDF("eventDay") === oilEventWeeklyCoverageC("tempDate") &&
    oil_gas_eventDF("eventGeo.countryCode") === oilEventWeeklyCoverageC("country"))

Checkpoint

oilEventCoverageDF.write.parquet(rootCheckpointPath + "oil_gas_eve_cov")
val oilEventCoverageDF = spark.read.parquet(rootCheckpointPath + "oil_gas_eve_cov")

Let us look at 2018

val tot_cov_2018 = oil_gas_cov_norm
  .filter($"date" > "2018-01-01" && $"date" < "2018-12-31")
  .groupBy($"date")
  .agg(sum($"coverage").as("coverage"))
  .orderBy(desc("coverage"))

Total Coverage

display(tot_cov_2018)
date coverage
2018-05-11 27.486972103755587
2018-05-12 23.96793050873498
2018-05-10 22.996724752589337
2018-05-09 22.386286223280464
2018-05-08 20.1775002337218
2018-05-13 18.849262389568942
2018-05-07 18.356561615113378
2018-10-20 15.61132729731101
2018-04-16 15.462361791574757
2018-12-02 15.35916815092816
2018-04-15 15.283319413257985
2018-05-06 15.074567688377858
2018-10-14 14.456182009909384
2018-12-03 14.029036688423556
2018-04-21 13.984623515510002
2018-10-15 13.913223469927237
2018-10-17 13.899999502069956
2018-03-10 13.596583766909582
2018-03-11 13.328385180680641
2018-04-22 13.313579680574394
only showing top 20 rows


Coverage Grouped by Country

display(oil_gas_cov_norm.filter($"date" >"2018-01-01" && $"date"<"2018-12-31").orderBy(desc("coverage")).limit(1000))
country date articles coverage
US 2018-04-19 293978.0 17.06571958044547
US 2018-03-10 121473.0 16.997419653664288
US 2018-04-20 128613.0 16.8411625445618
US 2018-04-16 121972.0 16.804795231624784
US 2018-04-18 325093.0 16.69630497544722
US 2018-04-21 94415.0 16.66300997755572
US 2018-03-11 192305.0 16.321746675576378
US 2018-04-17 198695.0 16.118084162377855
US 2018-03-12 121061.0 16.02109075933443
US 2018-03-08 167306.0 16.003087827280364
US 2018-05-10 239649.0 15.9504831207996
US 2018-03-09 172436.0 15.937985332447004
US 2018-05-11 156365.0 15.620396928759616
US 2018-01-08 125330.0 15.618951133444076
US 2018-05-09 250456.0 15.48443656158681
US 2018-01-07 75339.0 15.46596096567959
US 2018-04-15 49398.0 15.288016926844017
US 2018-01-09 135528.0 15.24454575423161
US 2018-04-22 75971.0 15.150165904258586
US 2018-03-13 254054.0 15.04659691290507
only showing top 20 rows
US 2018-09-09 41250.0 9.423885028895816
US 2018-07-01 29338.0 9.423551383822998
US 2018-07-02 88983.0 9.381150655819212
US 2018-10-03 128700.0 9.359449824208093
US 2018-02-06 119989.0 9.342906589347596
US 2018-12-14 105697.0 9.339903783692248
US 2018-02-07 132458.0 9.321233561492546
US 2018-05-31 130123.0 9.318536597153946
US 2018-09-29 53523.0 9.305746869362638
US 2018-06-04 78904.0 9.298045228931786
US 2018-07-03 116642.0 9.256464711732006
US 2018-03-28 125804.0 9.236654535533516
US 2018-07-23 107114.0 9.206348441419332
US 2018-06-03 33524.0 9.203484654544322
US 2018-12-18 63736.0 9.169313838336679
US 2018-12-10 83582.0 9.168785566971385
US 2018-10-17 162530.0 9.166297130803295
US 2018-01-29 74409.0 9.165463018121253
US 2018-10-01 91419.0 9.164545494171007
US 2018-09-11 136572.0 9.15999958005388
US 2018-09-13 98737.0 9.135031807104763
US 2018-09-20 94641.0 9.1131085454451
US 2018-08-14 100562.0 9.086111098303014
US 2018-12-12 105318.0 9.07701927006876
US 2018-10-18 105008.0 9.07105536439216
US 2018-09-16 58808.0 9.068928377052954
US 2018-07-22 53621.0 9.03136550260501
US 2018-09-19 109304.0 9.02291316076032
US 2018-09-17 114986.0 9.011986284625573
US 2018-11-21 155573.0 8.964136020432443
US 2018-07-24 120368.0 8.961021999752822
US 2018-11-20 143903.0 8.948148860693312
US 2018-07-25 96753.0 8.903259696521433
US 2018-09-12 137739.0 8.89094263258328
US 2018-09-30 40105.0 8.884283633004982
SA 2018-10-22 116446.0 8.871146358262825
US 2018-03-29 128571.0 8.867239930535263
US 2018-11-22 71274.0 8.862707918296168
US 2018-01-28 32053.0 8.861748688711822
US 2018-07-21 47326.0 8.843829167925959
US 2018-06-15 120982.0 8.803902974212228
US 2018-07-26 108648.0 8.789778666129655
US 2018-09-10 64536.0 8.773721997000349
US 2018-06-02 58290.0 8.771469892758835
US 2018-10-16 108976.0 8.73169661970348
US 2018-08-03 104387.0 8.658030568001163
US 2018-01-24 96205.0 8.655917482539987
US 2018-06-05 109023.0 8.612918973780738
US 2018-12-17 67917.0 8.544869280804166
US 2018-09-18 109802.0 8.507111780063742
US 2018-01-25 118254.0 8.477194938534513
US 2018-12-15 104170.0 8.464919580230465
US 2018-12-11 126676.0 8.425076797784937
SA 2018-10-23 116063.0 8.419627261595597
US 2018-06-08 74082.0 8.414149921650191
US 2018-06-07 136925.0 8.39927491215378
US 2018-06-06 139956.0 8.382578756634912
US 2018-12-19 95718.0 8.377184827957707
US 2018-08-13 110885.0 8.360822317511655
US 2018-07-27 121914.0 8.333560734686925
US 2018-12-16 63635.0 8.331461551103787
US 2018-07-04 70095.0 8.32111855384647
US 2018-07-08 21576.0 8.270515717802605
US 2018-08-12 41224.0 8.260381248715797
US 2018-07-05 138007.0 8.213212176546335
US 2018-07-20 139561.0 8.141047527671692
US 2018-06-14 103210.0 8.087608708508885
US 2018-08-01 91525.0 8.086482656388128
US 2018-07-30 74297.0 8.041106926485059
US 2018-07-28 43171.0 7.986820092762175
US 2018-07-09 49654.0 7.96962346963408
US 2018-11-23 56158.0 7.927292251020463
US 2018-07-29 45458.0 7.914141074400271
US 2018-06-09 41721.0 7.891870265789757
US 2018-08-02 117781.0 7.887046314111948
US 2018-01-26 102416.0 7.864219430380073
US 2018-07-31 95426.0 7.797448710182633
US 2018-01-27 43355.0 7.7897192659957115
US 2018-11-27 88339.0 7.770326146138241
US 2018-12-20 124936.0 7.756841324445233
US 2018-06-13 98981.0 7.668216851978299
US 2018-07-06 132697.0 7.66646521534601
IR 2018-05-10 98317.0 7.607938308822747
IR 2018-05-11 42017.0 7.539958125236347
IR 2018-05-09 196921.0 7.532895971195059
US 2018-06-12 71454.0 7.505537075224091
IR 2018-05-08 170920.0 7.434123127763284
US 2018-07-07 33952.0 7.421361203728045
US 2018-07-10 99011.0 7.3869679574718585
US 2018-07-19 96061.0 7.366003925396543
US 2018-07-11 131177.0 7.326383572999561
US 2018-06-10 34725.0 7.322240813345419
SA 2018-10-13 18580.0 7.281967072680836
US 2018-07-13 90785.0 7.214014692850504
US 2018-07-12 116363.0 7.190256383290348
US 2018-11-24 68632.0 7.154848299937678
US 2018-12-21 150615.0 7.150024348259869
IR 2018-05-07 20107.0 7.117785893098955
US 2018-06-11 79974.0 6.853538995428132
US 2018-07-14 29594.0 6.808135461768994
US 2018-07-18 83263.0 6.784391054086874
SA 2018-10-24 125847.0 6.782917455015266
US 2018-11-26 59869.0 6.7205953357887145
US 2018-12-22 47190.0 6.635710468512932
US 2018-07-17 69815.0 6.537882952787471
IR 2018-09-25 74769.0 6.407858687535197
IR 2018-09-24 49741.0 6.326894149865013
SA 2018-10-25 37872.0 6.225048991387717
US 2018-11-25 25976.0 6.196855982734706
US 2018-07-15 11784.0 6.14204087764653
IR 2018-09-23 109693.0 6.0511921046941515
IR 2018-05-06 8554.0 5.899091657123839
US 2018-07-16 51363.0 5.859804949799675
US 2018-12-23 19012.0 5.682639416134008
IR 2018-05-12 16179.0 5.45491275205858
IR 2018-09-22 126435.0 5.316311028059311
IR 2018-09-26 62879.0 4.890885756461957
SA 2018-10-26 41015.0 4.7543693141677945
US 2018-12-24 24267.0 4.737603649258753
SY 2018-04-14 77719.0 4.541837402783558
TU 2018-10-19 32942.0 4.493472769103172
TU 2018-10-17 87405.0 4.406919676463308
SA 2018-10-12 38993.0 4.402749113053098
IR 2018-09-21 6514.0 4.39340705101423
TU 2018-10-18 31166.0 4.386261485704741
US 2018-12-30 23279.0 4.318462026532781
SY 2018-04-15 92616.0 4.209123755795143
TU 2018-10-20 32608.0 4.161412510382355
SY 2018-04-16 51381.0 4.067310697970013
TU 2018-10-16 72217.0 4.017722699022632
IS 2018-05-12 2255.0 3.939816574519934
SY 2018-04-13 12182.0 3.928722875848778
SY 2018-04-17 51599.0 3.9177820978359974
IS 2018-05-13 1307.0 3.8505943212975358
IR 2018-09-20 16401.0 3.830172462465547
KN 2018-05-11 9007.0 3.8239305192282673
KN 2018-05-10 151876.0 3.7756910024501873
KN 2018-05-09 99867.0 3.7698522136758945
TU 2018-10-15 49492.0 3.7396851383420526
SY 2018-04-12 21443.0 3.725936180966398
KN 2018-05-08 12105.0 3.6731924557052915
KN 2018-05-12 7621.0 3.6677012138818506
US 2018-12-29 31458.0 3.6640728237149687
US 2018-12-28 50325.0 3.6340725709175348
IR 2018-08-06 61148.0 3.6273301600710304
KN 2018-05-07 329.0 3.6256341309508784
IR 2018-08-07 100863.0 3.560489930483419
RS 2018-07-19 40550.0 3.538052299336496
IR 2018-08-05 15094.0 3.51967401657551
RS 2018-07-18 28985.0 3.5130011151191765
RS 2018-07-17 68773.0 3.509470038098533
IR 2018-08-08 53752.0 3.5018379070578507
IR 2018-09-27 36233.0 3.4635243311960666
TU 2018-10-21 20839.0 3.4264063168452097
TU 2018-10-14 22325.0 3.426142181162563
IR 2018-08-09 21172.0 3.357981273161719
US 2018-12-25 26740.0 3.3433843012259885
IR 2018-05-05 9074.0 3.317318279912184
SA 2018-10-27 11084.0 3.3093803075547537
IR 2018-11-05 75534.0 3.2770028336135
IR 2018-11-04 32874.0 3.276127015297356
RS 2018-07-14 8640.0 3.1989020828183254
TU 2018-10-22 57204.0 3.1911865405094395
US 2018-12-27 56957.0 3.1839992695658466
IR 2018-11-03 29708.0 3.1837768395173023
IR 2018-09-28 12338.0 3.1835822132248253
TU 2018-10-23 48331.0 3.1767702929881514
IS 2018-05-11 8186.0 3.168234539875258
IR 2018-05-13 13952.0 3.149286280114876
US 2018-12-26 27161.0 3.1246799559946448
RS 2018-07-16 58266.0 3.084837173549118
RS 2018-07-15 6153.0 3.0744802744137663
RS 2018-03-16 25299.0 3.044549531006502
IR 2018-08-04 10215.0 3.038460508427597
IR 2018-07-24 34917.0 2.9866204052387033
RS 2018-07-20 52232.0 2.978765844149476
IR 2018-07-25 13467.0 2.9717175919862244
IS 2018-05-14 132441.0 2.9480426936942727
IR 2018-07-23 121138.0 2.9291222376899593
SA 2018-11-20 41960.0 2.915665219753019
KN 2018-03-08 8525.0 2.8903916054871543
RS 2018-03-18 66413.0 2.8767816668918407
SY 2018-04-18 6987.0 2.8668140203414416
RS 2018-03-19 61502.0 2.8591540855446915
SA 2018-11-21 61467.0 2.8581670522042755
IS 2018-05-15 62384.0 2.8523004596739154
FR 2018-12-03 26729.0 2.848922303311646
FR 2018-12-04 22382.0 2.8355904022770124
IS 2018-05-16 8920.0 2.828778482040338
KN 2018-03-07 10783.0 2.8284587388455553
IS 2018-05-17 7721.0 2.8191861861968586
IR 2018-07-22 21615.0 2.8127913223012047
RS 2018-04-13 28932.0 2.7998486738515243
IR 2018-11-06 37292.0 2.79624808744071
TU 2018-10-24 34534.0 2.786405557792618
RS 2018-03-17 11413.0 2.745992798347696
IR 2018-07-26 14868.0 2.7445887086662597
IR 2018-11-02 42823.0 2.7344820433355195
RS 2018-04-14 31685.0 2.733786949433818
IR 2018-07-21 8979.0 2.726933323563042
SA 2018-10-11 22749.0 2.7074845961934355
SA 2018-11-19 16766.0 2.6906077162601245
SA 2018-11-22 40463.0 2.679694742003412
RS 2018-03-20 14935.0 2.664319264897776
SY 2018-04-11 30920.0 2.6611079310719146
KN 2018-03-09 101156.0 2.646177314063368
IR 2018-08-10 5025.0 2.6344302271246134
RS 2018-04-15 15941.0 2.5899720211717887
TU 2018-10-25 14246.0 2.56675588485496
CH 2018-03-26 15479.0 2.5665056510503477
SA 2018-11-23 26002.0 2.553187651893748
RS 2018-03-21 24279.0 2.545305287048454
KN 2018-05-13 1944.0 2.491560626568898
IR 2018-11-07 15431.0 2.4846513931859855
CH 2018-03-25 19450.0 2.466189699156795
CH 2018-03-27 25021.0 2.458418549335772
IR 2018-07-20 6869.0 2.4519958816840512
FR 2018-12-02 23278.0 2.4396093083557315
CH 2018-03-28 101895.0 2.4361755444813262
KN 2018-03-06 36844.0 2.4332005425820435
CH 2018-03-30 6122.0 2.425054042054103
KN 2018-03-10 32100.0 2.4148778673331934
IR 2018-09-19 10017.0 2.4123060198968984
RS 2018-04-12 38212.0 2.39581839254854
RS 2018-02-14 15800.0 2.390118622554588
TU 2018-10-13 4612.0 2.379900742199577
UK 2018-04-18 56380.0 2.3646364801182136
RS 2018-03-15 14247.0 2.361161010609706
AR 2018-11-30 48495.0 2.3373887991715168
SA 2018-11-18 20211.0 2.333454567687886
RS 2018-07-13 21687.0 2.3163135520719282
UK 2018-04-21 13971.0 2.308097542153817
KN 2018-03-11 4607.0 2.3030650623054987
KN 2018-06-12 53681.0 2.284380938227764
RS 2018-04-16 35940.0 2.2754976381640195
UK 2018-04-20 21037.0 2.274454997311467
IR 2018-09-29 17315.0 2.26625288927139
KN 2018-06-13 21236.0 2.264765388321749
AR 2018-12-01 36929.0 2.2638756681275716
KN 2018-06-11 71286.0 2.2607894512040168
CH 2018-03-31 2621.0 2.2375594130091545
UK 2018-04-17 16163.0 2.2356965613525945
RS 2018-04-11 30356.0 2.2337781021838987
KN 2018-06-10 16032.0 2.2238104556334997
UK 2018-04-19 32276.0 2.2161783245928177
CH 2018-03-29 15806.0 2.2137454959368625
KN 2018-03-12 7485.0 2.2016091564131552
AR 2018-12-02 48505.0 2.183091854871829
SA 2018-11-24 9880.0 2.1829111304573865
IR 2018-08-03 9833.0 2.166298386206722
AR 2018-11-29 13730.0 2.154745925560444
RS 2018-07-21 8894.0 2.146557719398401
RS 2018-04-17 26736.0 2.131696611780024
IR 2018-11-08 13215.0 2.1220070027903057
UK 2018-07-11 27825.0 2.107090287659793
SA 2018-10-10 29693.0 2.1045184402234973
UK 2018-04-16 23182.0 2.103823346321796
RS 2018-04-09 6877.0 2.098401613888524
RS 2018-12-02 5691.0 2.0927574514067087
UK 2018-07-12 38075.0 2.088545182362398
UK 2018-07-10 24389.0 2.060449486855625
KN 2018-06-14 5682.0 2.0519276356207654
RS 2018-02-15 22046.0 2.0507181722318046
IR 2018-05-14 15217.0 2.042947022410783
FR 2018-12-01 70292.0 2.0375252899775114
UK 2018-05-14 13079.0 2.0163666316097197
AR 2018-12-03 15700.0 2.007302607131533
UK 2018-05-11 42485.0 1.9899391614670308
FR 2018-12-05 34985.0 1.9873534121527012
RS 2018-02-13 15796.0 1.9761484984572735
UK 2018-05-16 26616.0 1.9717416031204864
IR 2018-01-02 25965.0 1.9606618063273653
UK 2018-05-15 13218.0 1.9470379658540171
UK 2018-05-17 44767.0 1.944243688369177
KN 2018-06-09 2175.0 1.9374873756446391
UK 2018-07-13 41539.0 1.9361388934753383
RS 2018-04-10 31488.0 1.9333307141124645
UK 2018-05-18 37498.0 1.9209580426621784
RS 2018-02-16 43709.0 1.9187476440547682
UK 2018-04-15 18239.0 1.9088773106506076
UK 2018-06-19 33154.0 1.9045677284600584
UK 2018-05-19 9638.0 1.8946973950558978
SA 2018-10-09 25632.0 1.8912080236693567
FR 2018-12-06 33466.0 1.8872737921857266
UK 2018-05-10 32097.0 1.8818103541183535
IR 2018-05-18 12730.0 1.8800448156080312
IR 2018-09-09 17329.0 1.8762217991486734
RS 2018-12-03 7632.0 1.8636266976498435
CA 2018-08-08 20157.0 1.8594700361176686
IR 2018-09-10 6322.0 1.8592615079471582
TU 2018-10-26 31905.0 1.8554245896097663
RS 2018-12-01 21909.0 1.8538953830260227
CA 2018-08-09 20022.0 1.8528805459295388
UK 2018-06-17 15941.0 1.8499194459082908
SA 2018-10-28 10611.0 1.8495162914453036
UK 2018-05-13 8170.0 1.8402298369185723
FR 2018-11-30 4952.0 1.8362399979228063
IR 2018-09-11 28622.0 1.8351695533141859
CA 2018-08-07 16160.0 1.8335291317061704
IR 2018-01-04 14518.0 1.8334457204379664
UK 2018-06-20 20877.0 1.8305124241727861
RS 2018-11-30 54768.0 1.8242148734233707
RS 2018-07-22 7955.0 1.8189460616484745
RS 2018-02-17 36154.0 1.8171388175040502
IR 2018-09-30 7019.0 1.8123982770944465
IR 2018-11-01 13152.0 1.8119951226314597
UK 2018-05-09 19946.0 1.80440469722488
RS 2018-11-29 30313.0 1.8014296953255975
CA 2018-12-05 19699.0 1.7984963990604175
IR 2018-01-03 39705.0 1.7984963990604173
FR 2018-11-12 15471.0 1.7952016539663525
UK 2018-06-18 23572.0 1.7947706957472975
IR 2018-05-19 12641.0 1.7812858740542894
IR 2018-09-08 58230.0 1.7794647280318316
CA 2018-12-06 32143.0 1.7773238388145916
SA 2018-08-20 17978.0 1.7772126237903192
CA 2018-06-09 16115.0 1.7692468476768202
RS 2018-04-18 20011.0 1.7652292049249863
RS 2018-03-14 14871.0 1.754135506253831
FR 2018-11-11 49234.0 1.7502568822823368
RS 2018-03-22 12979.0 1.7489779095032063
CH 2018-04-05 15074.0 1.74785185738245
RS 2018-02-18 6168.0 1.7476294273339055
UK 2018-05-12 7861.0 1.7475043104315995
ID 2018-10-31 25051.0 1.7457665756773455
UK 2018-06-16 18227.0 1.7452522061900866
UK 2018-05-20 6192.0 1.7444180935080447
ID 2018-11-01 3857.0 1.7443346822398405
SA 2018-08-16 1506.0 1.7423328118029404
RS 2018-11-28 9878.0 1.7415126009989326
UK 2018-05-08 30657.0 1.739163183611182
SA 2018-08-21 5519.0 1.7389685573187055
UK 2018-07-14 6163.0 1.7280555830619928
RS 2018-04-01 7520.0 1.72587298821065
ID 2018-10-30 21182.0 1.7245662116754514
CA 2018-12-07 37891.0 1.7241908609685326
CH 2018-04-04 55902.0 1.72014541446063
CH 2018-04-03 11534.0 1.71907496985201
CA 2018-06-10 20236.0 1.7165726318058847
UK 2018-04-22 7560.0 1.709218538325883
CH 2018-04-07 2698.0 1.708926598887169
CA 2018-04-24 31092.0 1.7049645636474706
CA 2018-12-04 16089.0 1.7045475073064495
BE 2018-07-12 45304.0 1.6975965682894352
BE 2018-07-11 63508.0 1.6848207423761625
CA 2018-08-30 32008.0 1.6814981935260296
RS 2018-04-08 4285.0 1.6806084733318514
ID 2018-10-29 81121.0 1.6759235404343842
UK 2018-06-21 9841.0 1.6759235404343842
FR 2018-11-10 31170.0 1.6752145446546487
RS 2018-03-31 8229.0 1.670571317391283
RS 2018-03-13 36411.0 1.669208933343948
CA 2018-12-08 11951.0 1.6688752882711313
BE 2018-07-10 12444.0 1.6625638356436823
CA 2018-08-06 61635.0 1.6617297229616403
FR 2018-11-28 6062.0 1.656683341235288
RS 2018-04-04 19788.0 1.6560577567237567
IZ 2018-05-10 1888.0 1.6546397651642857
RS 2018-05-08 15439.0 1.652290347776535
FR 2018-12-07 3993.0 1.6509696693633023
RS 2018-04-06 40806.0 1.6503440848517705
FR 2018-11-13 18592.0 1.6492736402431507
IZ 2018-05-11 12134.0 1.647980765585986
CH 2018-12-03 15953.0 1.644602609223717
RS 2018-05-09 45093.0 1.6387360166933564
IR 2018-05-15 20937.0 1.6358027204281762
BE 2018-07-09 7253.0 1.635121528404509
SA 2018-08-17 6684.0 1.6334811067964934
RS 2018-02-26 14538.0 1.630923161238232
RS 2018-04-03 39879.0 1.6304226936290074
CH 2018-11-30 15171.0 1.6300473429220883
RS 2018-04-02 4053.0 1.6294634640446592
IZ 2018-05-12 88160.0 1.6277535330464736
ID 2018-10-28 186.0 1.627405986095623
CA 2018-08-29 11360.0 1.6258072701217097
RS 2018-05-10 30477.0 1.6227210531981553
CA 2018-12-09 4267.0 1.6219703517843178
CH 2018-12-02 35823.0 1.616910068179931
CA 2018-04-25 17322.0 1.6163122874244678
RS 2018-05-07 16782.0 1.6134067949153559
CH 2018-04-06 15551.0 1.6119609995998165
TU 2018-10-12 12942.0 1.6118219808194765
RS 2018-04-05 8159.0 1.6110851812836728
BE 2018-07-13 2186.0 1.6104456948941075
RS 2018-07-12 39805.0 1.6099313254068484
RS 2018-02-19 9501.0 1.6077070249214038
IR 2018-08-02 13428.0 1.6048293361683599
CA 2018-06-08 25083.0 1.6033001295846165
SY 2018-04-19 11242.0 1.602090666195656
IR 2018-01-05 12007.0 1.5995049168813267
CH 2018-04-01 3450.0 1.5981703365900601
CH 2018-12-04 7477.0 1.5969052656889633
SY 2018-04-10 7496.0 1.5960294473728194
SA 2018-08-19 37751.0 1.5951119234225735
SA 2018-08-18 53071.0 1.5941109882041236
CA 2018-06-11 17851.0 1.5905104017933098
UK 2018-07-09 21538.0 1.5881748862835932
CH 2018-04-02 30679.0 1.5879941618691509
IZ 2018-05-13 12415.0 1.5872573623333472
IR 2018-05-16 31071.0 1.5866178759437817
CA 2018-09-02 7408.0 1.5862564271148971
IZ 2018-09-07 13050.0 1.585658646359434
CA 2018-04-26 18877.0 1.5770116782222678
IZ 2018-05-14 7622.0 1.5745788495663129
RS 2018-04-19 15591.0 1.5735501105917948
RS 2018-02-24 10084.0 1.5718401795936092
CA 2018-08-31 42147.0 1.5701997579855937
FR 2018-11-29 4023.0 1.5689485889625314
RS 2018-03-09 15082.0 1.5658345682829087
UK 2018-04-25 13301.0 1.5613025560438152
IR 2018-05-22 13833.0 1.553600915612963
IR 2018-05-20 11321.0 1.5513210076153825
RS 2018-05-11 11097.0 1.550125446104456
IR 2018-05-17 18735.0 1.5500420348362514
RS 2018-07-11 37935.0 1.5493886465686526
SA 2018-11-17 14016.0 1.5488186695692572
IZ 2018-09-08 48602.0 1.5475814024242285
FR 2018-11-14 7486.0 1.5465248596936425
CH 2018-12-01 23734.0 1.544773223061355
IR 2018-08-29 21165.0 1.5437305822088025
RS 2018-03-28 15810.0 1.5413811648210516
IR 2018-08-30 10478.0 1.5351253197057386
CH 2018-05-20 18729.0 1.5346804596086496
KN 2018-05-06 1524.0 1.5345553427063436
IR 2018-09-06 10386.0 1.5345553427063436
RS 2018-04-07 10073.0 1.5336934262682338
SA 2018-10-29 10656.0 1.5321920234405586
CH 2018-11-29 14561.0 1.529564568492127
RS 2018-02-23 25545.0 1.5293421384435828
RS 2018-02-25 39728.0 1.5282438900788942
RS 2018-03-12 12344.0 1.5271734454702741
CA 2018-04-23 15267.0 1.521821222427173
RS 2018-03-29 15094.0 1.5189852393082308
CA 2018-08-28 16504.0 1.5153290453852815
CA 2018-04-27 32303.0 1.5119230852669443
CA 2018-04-21 11152.0 1.5117979683646379
IR 2018-09-07 10204.0 1.5110472669508004
UK 2018-04-13 11551.0 1.5107692293901198
SY 2018-04-09 36794.0 1.5062233152729922
CA 2018-09-01 11791.0 1.506167707760856
UK 2018-06-15 18644.0 1.5053892025909505
IR 2018-08-28 26057.0 1.504054622299684
RS 2018-03-10 5304.0 1.5039156035193437
SY 2018-12-22 10096.0 1.5002733114744282
UK 2018-04-26 15228.0 1.4996477269628967
IR 2018-10-01 29604.0 1.497479033989588
IR 2018-05-03 10653.0 1.4965337062832742
CA 2018-06-13 8809.0 1.494948892187395
CA 2018-04-22 7783.0 1.4947681677729523
IR 2018-05-21 38955.0 1.4941286813833872
CH 2018-05-19 6233.0 1.4908339362893224
FR 2018-11-09 12853.0 1.4904724874604376
RS 2018-03-11 22748.0 1.4887903602183201
SA 2018-11-25 7373.0 1.488387205755333
UK 2018-07-15 1859.0 1.4880535606825167
IZ 2018-09-06 14122.0 1.485440007612119
IZ 2018-05-09 4305.0 1.485203675685541
FR 2018-12-08 9276.0 1.4834798428093212
AR 2018-11-28 8399.0 1.4811026216655023
CH 2018-05-11 4308.0 1.4802963127395283
RS 2018-07-10 5286.0 1.4784890685951046
CH 2018-05-17 33464.0 1.4749301878183931
RS 2018-03-08 11486.0 1.47333147184448
IR 2018-07-29 5277.0 1.4725807704306422
UK 2018-04-23 27374.0 1.4722193216017576
UK 2018-06-22 26542.0 1.4708847413104909
IR 2018-08-01 19137.0 1.4683545995082972
CA 2018-08-05 3423.0 1.4673953699239495
SY 2018-12-21 51717.0 1.467020019217031
RS 2018-03-27 21497.0 1.466630766632078
RS 2018-02-12 18994.0 1.4644064661466332
IR 2018-01-10 18383.0 1.461473169881453
RS 2018-11-27 12183.0 1.4595825144688253
RS 2018-02-22 11981.0 1.4593183787861788
IR 2018-01-11 12379.0 1.4535490994020566
IR 2018-05-23 14529.0 1.4533127674754784
SY 2018-12-23 2846.0 1.4528679073783894
IR 2018-07-30 20760.0 1.4525620660616405
FR 2018-11-27 7903.0 1.449809494210903
IR 2018-05-02 11207.0 1.4486278345780104
CA 2018-06-07 27936.0 1.4457362439469321
UK 2018-04-12 18253.0 1.4456389308006943
RS 2018-07-24 8910.0 1.4452774819718093
BE 2018-07-14 1617.0 1.4449716406550608
CH 2018-05-18 17601.0 1.440314511513661
RS 2018-07-23 18035.0 1.4389521274663264
IR 2018-07-31 38126.0 1.4362690650057586
IZ 2018-05-15 1718.0 1.4274691762102183
UK 2018-01-14 3048.0 1.425314385114944
RS 2018-02-27 11544.0 1.4217416024601985
CA 2018-10-15 15977.0 1.421588681801824
UK 2018-06-14 13808.0 1.4206433540955101
IR 2018-08-27 41619.0 1.4179602916349427
IR 2018-07-04 19242.0 1.4178768803667385
FR 2018-11-26 7502.0 1.4166257113436758
RS 2018-03-30 40737.0 1.4150269953697623
SA 2018-12-03 8704.0 1.4106757075451113
RS 2018-05-12 3757.0 1.4094940479122189
SN 2018-06-11 38071.0 1.407214139914638
IR 2018-10-31 8788.0 1.4044059605517643
IR 2018-08-26 5418.0 1.4041418248691178
SA 2018-10-08 19878.0 1.4039889042107434
FR 2018-11-22 2050.0 1.4029740671142592
SN 2018-06-10 22899.0 1.3943966083672632
CA 2018-06-12 18601.0 1.3943827064892294
IR 2018-07-28 7907.0 1.3937571219776983
FR 2018-11-23 2565.0 1.3933539675147113
IR 2018-08-25 4286.0 1.3908238257125178
FR 2018-11-24 55411.0 1.3900592224206465
FR 2018-11-25 29589.0 1.3891973059825367
SA 2018-12-04 21457.0 1.3871815336676023
IR 2018-10-02 8783.0 1.3833168115741423
UK 2018-01-15 13569.0 1.3828858533550874
UK 2018-04-14 4696.0 1.3786874861888108
CA 2018-10-16 24171.0 1.3774224152877141
RS 2018-07-25 5419.0 1.3735993988283561
CH 2018-05-07 10919.0 1.3719728790983745
UK 2018-04-24 18583.0 1.3671489274205662
IR 2018-07-05 17456.0 1.3656614264709255
IR 2018-06-07 9686.0 1.3656197208368235
RS 2018-07-09 7454.0 1.363478831619583
IS 2018-05-09 15338.0 1.362922756498222
CA 2018-06-14 18868.0 1.3608930823052536
IR 2018-09-05 8062.0 1.3603648109399606
IS 2018-05-10 72644.0 1.3594333851116807
UK 2018-01-13 12563.0 1.3590997400388638
CA 2018-09-03 2884.0 1.3566530095048748
RS 2018-02-20 8487.0 1.3551933123113018
SA 2018-12-02 12480.0 1.3533304606547418
IS 2018-05-08 6882.0 1.3532470493865376
CA 2018-12-10 13139.0 1.3519124690952709
CA 2018-10-17 33541.0 1.3517178428027943
CA 2018-10-14 10781.0 1.3498688930242686
IR 2018-05-01 28455.0 1.3492989160248734
IR 2018-07-27 11005.0 1.349145995366499
CH 2018-05-06 2251.0 1.3471302230515647
AR 2018-12-04 2242.0 1.3466019516862722
SY 2018-04-08 16020.0 1.342459192032131
IR 2018-05-24 14621.0 1.3422089582275187
CA 2018-12-03 16961.0 1.3412775323992387
CA 2018-01-18 13738.0 1.3391088394259298
CH 2018-07-08 2205.0 1.3379688854271399
IR 2018-05-04 19262.0 1.3367872257942472
UK 2018-04-10 25664.0 1.3359392112341717
IR 2018-01-06 6834.0 1.3340485558215438
CA 2018-06-17 10593.0 1.3338400276510332
IR 2018-06-08 6583.0 1.3332144431395019
IR 2018-07-03 21279.0 1.33217180228695
UK 2018-05-07 5301.0 1.330003109313641
VE 2018-05-25 1774.0 1.3277788088281965
FR 2018-05-11 12034.0 1.326569345439236
SN 2018-06-12 31123.0 1.3251374520017312
FR 2018-11-08 2248.0 1.3230799740526948
CH 2018-05-21 22177.0 1.32178709939553
CA 2018-04-20 19129.0 1.3207583604210118
TU 2018-10-27 4528.0 1.3195349951540176
IR 2018-07-06 15913.0 1.3192152519592348
IR 2018-08-11 5996.0 1.3185618636916352
UK 2018-04-09 18497.0 1.3161985444258504
IZ 2018-09-09 15579.0 1.3148361603785157
CH 2018-05-16 8081.0 1.3134876782092146
RS 2018-12-07 14087.0 1.312625761771105
IR 2018-01-12 19851.0 1.311833354723165
CA 2018-01-19 18253.0 1.3110548495532597
IR 2018-06-05 31939.0 1.3101234237249797
IS 2018-05-07 2329.0 1.3096090542377208
UK 2018-04-11 18242.0 1.3050214344864912
CH 2018-12-05 14743.0 1.3037146579512924
RS 2018-12-08 19621.0 1.302046432587209
VE 2018-05-26 14188.0 1.3010315954907252
IZ 2018-09-05 18357.0 1.301017693612691
RS 2018-02-28 7664.0 1.3009898898566228
UK 2018-05-21 11404.0 1.2999750527601388
SN 2018-06-09 2957.0 1.2989602156636546
CA 2018-08-04 3257.0 1.2985570612006678
RS 2018-12-04 14318.0 1.2980843973475107
CH 2018-07-09 6810.0 1.2980009860793065
SY 2018-12-20 30894.0 1.2926487630362058
RS 2018-07-26 13216.0 1.2913975940131428
SN 2018-06-13 7445.0 1.2907442057455434
CH 2018-05-10 19231.0 1.2891732935276983
ID 2018-10-27 53.0 1.2841130099233116
IR 2018-06-06 42002.0 1.2839878930210051
CH 2018-05-15 8474.0 1.2837932667285288
UK 2018-04-27 13479.0 1.2823474714129899
CA 2018-04-19 16931.0 1.281388241828642
SA 2018-12-01 18669.0 1.277050855882025
UK 2018-04-08 8124.0 1.2756606680786218
UK 2018-06-13 13348.0 1.2738534239341983
RS 2018-02-11 30582.0 1.273797816422062
RS 2018-12-06 13831.0 1.2664576248200947
RS 2018-12-05 27060.0 1.2662769004056522
IR 2018-06-04 8450.0 1.266054470357108
VE 2018-05-24 2234.0 1.265595708381985
IR 2018-01-13 31617.0 1.2649284182363516
CA 2018-01-17 22089.0 1.2645113618953305
UK 2018-07-08 3193.0 1.2640525999202077
CA 2018-04-18 18547.0 1.258450143072494
CA 2018-05-31 17206.0 1.257351894707806
UK 2018-03-19 10345.0 1.2568931327326829
CA 2018-05-30 16372.0 1.2558504918801305
CH 2018-06-18 11364.0 1.2555446505633818
CA 2018-06-18 8726.0 1.2549190660518508
SY 2018-12-19 7622.0 1.2543768928085235
SA 2018-08-15 3313.0 1.2514714002994116
UK 2018-04-28 27937.0 1.2506233857393358
CA 2018-10-18 13234.0 1.2487744359608102
CA 2018-04-17 17350.0 1.245396279598541
RS 2018-05-06 6387.0 1.2431580772350623
CA 2018-06-01 18011.0 1.2414203424808083
CH 2018-05-22 12108.0 1.2403498978721885
UK 2018-04-29 3125.0 1.2380421861185393
CA 2018-07-25 18416.0 1.237277582826668
UK 2018-03-04 5144.0 1.236777115217443
CA 2018-07-24 18424.0 1.2365407832908646
CH 2018-08-05 3110.0 1.235915198779333
FR 2018-05-10 3506.0 1.2357205724868567
IR 2018-11-09 8241.0 1.2352062029995976
KN 2018-06-08 3686.0 1.2351227917313936
CA 2018-04-28 4775.0 1.2350810860972914
KN 2018-05-27 4184.0 1.2345806184880666
CH 2018-05-08 34641.0 1.2341496602690116
CA 2018-06-15 10975.0 1.2340384452447393
UK 2018-04-30 11743.0 1.2263090010578193
CA 2018-01-16 25882.0 1.226281197301751
CA 2018-05-29 33358.0 1.225669514668254
UK 2018-02-07 18936.0 1.2238205648897278
CH 2018-05-09 23575.0 1.2215406568921474
UK 2018-05-01 16301.0 1.2203868010153227
IR 2018-07-02 16038.0 1.219052220724056
CA 2018-07-26 6261.0 1.2161050225808423
CH 2018-05-23 11235.0 1.2149650685820517
UK 2018-11-22 11056.0 1.2136721939248871
UK 2018-03-18 11517.0 1.212365417389688
UK 2018-04-07 6920.0 1.2121290854631095
UK 2018-03-09 14277.0 1.211920557292599
IR 2018-06-03 2185.0 1.2110586408544894
CA 2018-06-06 12598.0 1.2098352755874946
IR 2018-08-24 4128.0 1.2078751107846968
VE 2018-05-27 59244.0 1.2072078206390633
UK 2018-06-23 12900.0 1.2061234741524092
IR 2018-07-07 14391.0 1.2040520943253388
UK 2018-03-08 16241.0 1.201480246889043
SA 2018-10-07 7012.0 1.2006461342070016
UK 2018-02-04 14635.0 1.1999510403053004
CH 2018-03-24 4221.0 1.199408867061973
UK 2018-02-06 15372.0 1.1989362032088162
VE 2018-05-28 9163.0 1.1985608525018974
IR 2018-10-03 30232.0 1.1971845665765284
UK 2018-03-21 22001.0 1.193597882043749
KN 2018-05-28 18718.0 1.1935005688975107
KN 2018-07-05 10469.0 1.1933198444830684
UK 2018-03-20 23703.0 1.1931947275807622
CH 2018-06-17 3934.0 1.192054773581972
SA 2018-11-30 14191.0 1.1918045397773593
CA 2018-06-19 13254.0 1.1911650533877938
RS 2018-07-29 2042.0 1.191095543997624
CA 2018-07-22 2925.0 1.190692389534637
KN 2018-04-21 24575.0 1.1871057050018574
CA 2018-10-19 8756.0 1.1822539495679811
CH 2018-03-08 8583.0 1.180655233594068
VE 2018-05-29 9601.0 1.1801547659848433
RS 2018-03-07 15959.0 1.1797377096438222
SA 2018-11-29 16006.0 1.1773743903780372
CA 2018-07-23 41474.0 1.172925789407148
RS 2018-03-04 1629.0 1.1712714659210985
UK 2018-02-05 19498.0 1.170048100654104
UK 2018-03-05 10324.0 1.1689359504113814
UK 2018-03-10 6580.0 1.167740388900455
SY 2018-12-24 13111.0 1.1651546395861259
CA 2018-08-03 6032.0 1.1647653870011732
SA 2018-12-05 16995.0 1.1642510175139138
UK 2018-07-16 10575.0 1.1642232137578457
IR 2018-12-04 11543.0 1.1634725123440082
IR 2018-10-30 4973.0 1.1627079090521366
CH 2018-03-09 22473.0 1.1626383996619665
SA 2018-08-07 12130.0 1.1618459926140268
CA 2018-06-16 23349.0 1.1597051033967862
RS 2018-07-30 3423.0 1.1596633977626845
UK 2018-06-12 15900.0 1.157522508545444
UK 2018-11-21 20627.0 1.1571749615945932
RS 2018-07-08 1798.0 1.1559376944495647
CA 2018-04-15 6133.0 1.1524483230630231
CA 2018-04-16 12435.0 1.152378813672853
CA 2018-05-27 3159.0 1.150891312723212
SY 2018-04-07 1105.0 1.150488158260225
IR 2018-10-04 13580.0 1.1499459850168983
SA 2018-11-15 14772.0 1.1488199328961417
SA 2018-11-26 7666.0 1.1483889746770866
FR 2018-11-21 6124.0 1.146998786873684
UK 2018-03-07 20636.0 1.1469431793615477
RS 2018-03-24 2852.0 1.1455112859240426
IR 2018-07-08 5070.0 1.1437040417796192
IR 2018-04-30 31598.0 1.142703106561169
UK 2018-05-22 11329.0 1.1408958624167451
UK 2018-03-23 8327.0 1.138935697613947
UK 2018-11-24 9398.0 1.1382684074683136
UK 2018-01-12 31543.0 1.1382684074683136
IR 2018-07-01 8826.0 1.137517706054476
FR 2018-12-09 16079.0 1.1353768168372356
SA 2018-11-16 9813.0 1.1333888482783696
SA 2018-08-09 11301.0 1.133124712595723
UK 2018-03-06 20454.0 1.132165483011375
UK 2018-02-03 4853.0 1.1320542679871026
CA 2018-12-02 5790.0 1.1320125623530006
IZ 2018-09-10 8923.0 1.131748426670354
RS 2018-04-20 18588.0 1.1270217881387843
SA 2018-12-06 20131.0 1.1267020449440017
RS 2018-03-06 40865.0 1.125436974042905
SY 2018-12-18 394.0 1.1228234209725076
RS 2018-05-23 12048.0 1.1220171120465339
CA 2018-03-08 13220.0 1.1185277406599927
IR 2018-01-09 6870.0 1.116942926564113
CH 2018-05-12 4097.0 1.1165258702230925
CH 2018-06-19 20359.0 1.1149688598832812
UK 2018-03-22 13776.0 1.1141486490792734
CH 2018-08-04 6976.0 1.113940120908763
RS 2018-05-22 5681.0 1.1139262190307289
IR 2018-09-12 15022.0 1.113523064567742
SA 2018-08-08 11386.0 1.1131477138608235
CA 2018-04-12 16936.0 1.1124387180810877
KN 2018-07-04 1450.0 1.1116185072770801
CA 2018-04-14 10213.0 1.1106453758146981
RS 2018-02-10 6376.0 1.1104646514002556
CA 2018-01-15 7031.0 1.1093107955234314
UK 2018-03-17 7314.0 1.108170841524641
UK 2018-11-03 6218.0 1.1078232945737903
SA 2018-12-07 12501.0 1.10675284996517
SA 2018-11-28 11508.0 1.1063774992582514
CH 2018-05-05 5004.0 1.1054738771860393
CA 2018-04-29 4956.0 1.1048343907964742
SY 2018-04-06 5722.0 1.1040280818705004
UK 2018-11-05 18989.0 1.1017203701168514
IR 2018-04-28 1929.0 1.100497004849857
CA 2018-03-09 19018.0 1.1002189672891765
CA 2018-08-10 18390.0 1.10009385038687


Coverage without USA

display(oil_gas_cov_norm.filter($"date" > "2018-01-01" && $"date" < "2018-12-31" && $"country" =!= "US").orderBy(desc("coverage")).limit(1000))
country date articles coverage
SA 2018-10-18 101417.0 12.684765148069788
SA 2018-10-17 194570.0 12.679829981367707
SA 2018-10-19 73494.0 12.331977189200233
SA 2018-10-16 232742.0 11.147328652530456
SA 2018-10-20 128817.0 10.709919962067769
SA 2018-10-15 141823.0 10.667699958478423
SA 2018-10-21 50740.0 9.754541197935195
SA 2018-10-14 50385.0 9.574067017297429
SA 2018-10-22 116446.0 8.871146358262825
SA 2018-10-23 116063.0 8.419627261595597
IR 2018-05-10 98317.0 7.607938308822747
IR 2018-05-11 42017.0 7.539958125236347
IR 2018-05-09 196921.0 7.532895971195059
IR 2018-05-08 170920.0 7.434123127763284
SA 2018-10-13 18580.0 7.281967072680836
IR 2018-05-07 20107.0 7.117785893098955
SA 2018-10-24 125847.0 6.782917455015266
IR 2018-09-25 74769.0 6.407858687535197
IR 2018-09-24 49741.0 6.326894149865013
SA 2018-10-25 37872.0 6.225048991387717
IR 2018-09-23 109693.0 6.0511921046941515
IR 2018-05-06 8554.0 5.899091657123839
IR 2018-05-12 16179.0 5.45491275205858
IR 2018-09-22 126435.0 5.316311028059311
IR 2018-09-26 62879.0 4.890885756461957
SA 2018-10-26 41015.0 4.7543693141677945
SY 2018-04-14 77719.0 4.541837402783558
TU 2018-10-19 32942.0 4.493472769103172
TU 2018-10-17 87405.0 4.406919676463308
SA 2018-10-12 38993.0 4.402749113053098
IR 2018-09-21 6514.0 4.39340705101423
TU 2018-10-18 31166.0 4.386261485704741
SY 2018-04-15 92616.0 4.209123755795143
TU 2018-10-20 32608.0 4.161412510382355
SY 2018-04-16 51381.0 4.067310697970013
TU 2018-10-16 72217.0 4.017722699022632
IS 2018-05-12 2255.0 3.939816574519934
SY 2018-04-13 12182.0 3.928722875848778
SY 2018-04-17 51599.0 3.9177820978359974
IS 2018-05-13 1307.0 3.8505943212975358
IR 2018-09-20 16401.0 3.830172462465547
KN 2018-05-11 9007.0 3.8239305192282673
KN 2018-05-10 151876.0 3.7756910024501873
KN 2018-05-09 99867.0 3.7698522136758945
TU 2018-10-15 49492.0 3.7396851383420526
SY 2018-04-12 21443.0 3.725936180966398
KN 2018-05-08 12105.0 3.6731924557052915
KN 2018-05-12 7621.0 3.6677012138818506
IR 2018-08-06 61148.0 3.6273301600710304
KN 2018-05-07 329.0 3.6256341309508784
IR 2018-08-07 100863.0 3.560489930483419
RS 2018-07-19 40550.0 3.538052299336496
IR 2018-08-05 15094.0 3.51967401657551
RS 2018-07-18 28985.0 3.5130011151191765
RS 2018-07-17 68773.0 3.509470038098533
IR 2018-08-08 53752.0 3.5018379070578507
IR 2018-09-27 36233.0 3.4635243311960666
TU 2018-10-21 20839.0 3.4264063168452097
TU 2018-10-14 22325.0 3.426142181162563
IR 2018-08-09 21172.0 3.357981273161719
IR 2018-05-05 9074.0 3.317318279912184
SA 2018-10-27 11084.0 3.3093803075547537
IR 2018-11-05 75534.0 3.2770028336135
IR 2018-11-04 32874.0 3.276127015297356
RS 2018-07-14 8640.0 3.1989020828183254
TU 2018-10-22 57204.0 3.1911865405094395
IR 2018-11-03 29708.0 3.1837768395173023
IR 2018-09-28 12338.0 3.1835822132248253
TU 2018-10-23 48331.0 3.1767702929881514
IS 2018-05-11 8186.0 3.168234539875258
IR 2018-05-13 13952.0 3.149286280114876
RS 2018-07-16 58266.0 3.084837173549118
RS 2018-07-15 6153.0 3.0744802744137663
RS 2018-03-16 25299.0 3.044549531006502
IR 2018-08-04 10215.0 3.038460508427597
IR 2018-07-24 34917.0 2.9866204052387033
RS 2018-07-20 52232.0 2.978765844149476
IR 2018-07-25 13467.0 2.9717175919862244
IS 2018-05-14 132441.0 2.9480426936942727
IR 2018-07-23 121138.0 2.9291222376899593
SA 2018-11-20 41960.0 2.915665219753019
KN 2018-03-08 8525.0 2.8903916054871543
RS 2018-03-18 66413.0 2.8767816668918407
SY 2018-04-18 6987.0 2.8668140203414416
RS 2018-03-19 61502.0 2.8591540855446915
SA 2018-11-21 61467.0 2.8581670522042755
IS 2018-05-15 62384.0 2.8523004596739154
FR 2018-12-03 26729.0 2.848922303311646
FR 2018-12-04 22382.0 2.8355904022770124
IS 2018-05-16 8920.0 2.828778482040338
KN 2018-03-07 10783.0 2.8284587388455553
IS 2018-05-17 7721.0 2.8191861861968586
IR 2018-07-22 21615.0 2.8127913223012047
RS 2018-04-13 28932.0 2.7998486738515243
IR 2018-11-06 37292.0 2.79624808744071
TU 2018-10-24 34534.0 2.786405557792618
RS 2018-03-17 11413.0 2.745992798347696
IR 2018-07-26 14868.0 2.7445887086662597
IR 2018-11-02 42823.0 2.7344820433355195
RS 2018-04-14 31685.0 2.733786949433818
IR 2018-07-21 8979.0 2.726933323563042
SA 2018-10-11 22749.0 2.7074845961934355
SA 2018-11-19 16766.0 2.6906077162601245
SA 2018-11-22 40463.0 2.679694742003412
RS 2018-03-20 14935.0 2.664319264897776
SY 2018-04-11 30920.0 2.6611079310719146
KN 2018-03-09 101156.0 2.646177314063368
IR 2018-08-10 5025.0 2.6344302271246134
RS 2018-04-15 15941.0 2.5899720211717887
TU 2018-10-25 14246.0 2.56675588485496
CH 2018-03-26 15479.0 2.5665056510503477
SA 2018-11-23 26002.0 2.553187651893748
RS 2018-03-21 24279.0 2.545305287048454
KN 2018-05-13 1944.0 2.491560626568898
IR 2018-11-07 15431.0 2.4846513931859855
CH 2018-03-25 19450.0 2.466189699156795
CH 2018-03-27 25021.0 2.458418549335772
IR 2018-07-20 6869.0 2.4519958816840512
FR 2018-12-02 23278.0 2.4396093083557315
CH 2018-03-28 101895.0 2.4361755444813262
KN 2018-03-06 36844.0 2.4332005425820435
CH 2018-03-30 6122.0 2.425054042054103
KN 2018-03-10 32100.0 2.4148778673331934
IR 2018-09-19 10017.0 2.4123060198968984
RS 2018-04-12 38212.0 2.39581839254854
RS 2018-02-14 15800.0 2.390118622554588
TU 2018-10-13 4612.0 2.379900742199577
UK 2018-04-18 56380.0 2.3646364801182136
RS 2018-03-15 14247.0 2.361161010609706
AR 2018-11-30 48495.0 2.3373887991715168
SA 2018-11-18 20211.0 2.333454567687886
RS 2018-07-13 21687.0 2.3163135520719282
UK 2018-04-21 13971.0 2.308097542153817
KN 2018-03-11 4607.0 2.3030650623054987
KN 2018-06-12 53681.0 2.284380938227764
RS 2018-04-16 35940.0 2.2754976381640195
UK 2018-04-20 21037.0 2.274454997311467
IR 2018-09-29 17315.0 2.26625288927139
KN 2018-06-13 21236.0 2.264765388321749
AR 2018-12-01 36929.0 2.2638756681275716
KN 2018-06-11 71286.0 2.2607894512040168
CH 2018-03-31 2621.0 2.2375594130091545
UK 2018-04-17 16163.0 2.2356965613525945
RS 2018-04-11 30356.0 2.2337781021838987
KN 2018-06-10 16032.0 2.2238104556334997
UK 2018-04-19 32276.0 2.2161783245928177
CH 2018-03-29 15806.0 2.2137454959368625
KN 2018-03-12 7485.0 2.2016091564131552
AR 2018-12-02 48505.0 2.183091854871829
SA 2018-11-24 9880.0 2.1829111304573865
IR 2018-08-03 9833.0 2.166298386206722
AR 2018-11-29 13730.0 2.154745925560444
RS 2018-07-21 8894.0 2.146557719398401
RS 2018-04-17 26736.0 2.131696611780024
IR 2018-11-08 13215.0 2.1220070027903057
UK 2018-07-11 27825.0 2.107090287659793
SA 2018-10-10 29693.0 2.1045184402234973
UK 2018-04-16 23182.0 2.103823346321796
RS 2018-04-09 6877.0 2.098401613888524
RS 2018-12-02 5691.0 2.0927574514067087
UK 2018-07-12 38075.0 2.088545182362398
UK 2018-07-10 24389.0 2.060449486855625
KN 2018-06-14 5682.0 2.0519276356207654
RS 2018-02-15 22046.0 2.0507181722318046
IR 2018-05-14 15217.0 2.042947022410783
FR 2018-12-01 70292.0 2.0375252899775114
UK 2018-05-14 13079.0 2.0163666316097197
AR 2018-12-03 15700.0 2.007302607131533
UK 2018-05-11 42485.0 1.9899391614670308
FR 2018-12-05 34985.0 1.9873534121527012
RS 2018-02-13 15796.0 1.9761484984572735
UK 2018-05-16 26616.0 1.9717416031204864
IR 2018-01-02 25965.0 1.9606618063273653
UK 2018-05-15 13218.0 1.9470379658540171
UK 2018-05-17 44767.0 1.944243688369177
KN 2018-06-09 2175.0 1.9374873756446391
UK 2018-07-13 41539.0 1.9361388934753383
RS 2018-04-10 31488.0 1.9333307141124645
UK 2018-05-18 37498.0 1.9209580426621784
RS 2018-02-16 43709.0 1.9187476440547682
UK 2018-04-15 18239.0 1.9088773106506076
UK 2018-06-19 33154.0 1.9045677284600584
UK 2018-05-19 9638.0 1.8946973950558978
SA 2018-10-09 25632.0 1.8912080236693567
FR 2018-12-06 33466.0 1.8872737921857266
UK 2018-05-10 32097.0 1.8818103541183535
IR 2018-05-18 12730.0 1.8800448156080312
IR 2018-09-09 17329.0 1.8762217991486734
RS 2018-12-03 7632.0 1.8636266976498435
CA 2018-08-08 20157.0 1.8594700361176686
IR 2018-09-10 6322.0 1.8592615079471582
TU 2018-10-26 31905.0 1.8554245896097663
RS 2018-12-01 21909.0 1.8538953830260227
CA 2018-08-09 20022.0 1.8528805459295388
UK 2018-06-17 15941.0 1.8499194459082908
SA 2018-10-28 10611.0 1.8495162914453036
UK 2018-05-13 8170.0 1.8402298369185723
FR 2018-11-30 4952.0 1.8362399979228063
IR 2018-09-11 28622.0 1.8351695533141859
CA 2018-08-07 16160.0 1.8335291317061704
IR 2018-01-04 14518.0 1.8334457204379664
UK 2018-06-20 20877.0 1.8305124241727861
RS 2018-11-30 54768.0 1.8242148734233707
RS 2018-07-22 7955.0 1.8189460616484745
RS 2018-02-17 36154.0 1.8171388175040502
IR 2018-09-30 7019.0 1.8123982770944465
IR 2018-11-01 13152.0 1.8119951226314597
UK 2018-05-09 19946.0 1.80440469722488
RS 2018-11-29 30313.0 1.8014296953255975
CA 2018-12-05 19699.0 1.7984963990604175
IR 2018-01-03 39705.0 1.7984963990604173
FR 2018-11-12 15471.0 1.7952016539663525
UK 2018-06-18 23572.0 1.7947706957472975
IR 2018-05-19 12641.0 1.7812858740542894
IR 2018-09-08 58230.0 1.7794647280318316
CA 2018-12-06 32143.0 1.7773238388145916
SA 2018-08-20 17978.0 1.7772126237903192
CA 2018-06-09 16115.0 1.7692468476768202
RS 2018-04-18 20011.0 1.7652292049249863
RS 2018-03-14 14871.0 1.754135506253831
FR 2018-11-11 49234.0 1.7502568822823368
RS 2018-03-22 12979.0 1.7489779095032063
CH 2018-04-05 15074.0 1.74785185738245
RS 2018-02-18 6168.0 1.7476294273339055
UK 2018-05-12 7861.0 1.7475043104315995
ID 2018-10-31 25051.0 1.7457665756773455
UK 2018-06-16 18227.0 1.7452522061900866
UK 2018-05-20 6192.0 1.7444180935080447
ID 2018-11-01 3857.0 1.7443346822398405
SA 2018-08-16 1506.0 1.7423328118029404
RS 2018-11-28 9878.0 1.7415126009989326
UK 2018-05-08 30657.0 1.739163183611182
SA 2018-08-21 5519.0 1.7389685573187055
UK 2018-07-14 6163.0 1.7280555830619928
RS 2018-04-01 7520.0 1.72587298821065
ID 2018-10-30 21182.0 1.7245662116754514
CA 2018-12-07 37891.0 1.7241908609685326
CH 2018-04-04 55902.0 1.72014541446063
CH 2018-04-03 11534.0 1.71907496985201
CA 2018-06-10 20236.0 1.7165726318058847
UK 2018-04-22 7560.0 1.709218538325883
CH 2018-04-07 2698.0 1.708926598887169
CA 2018-04-24 31092.0 1.7049645636474706
CA 2018-12-04 16089.0 1.7045475073064495
BE 2018-07-12 45304.0 1.6975965682894352
BE 2018-07-11 63508.0 1.6848207423761625
CA 2018-08-30 32008.0 1.6814981935260296
RS 2018-04-08 4285.0 1.6806084733318514
ID 2018-10-29 81121.0 1.6759235404343842
UK 2018-06-21 9841.0 1.6759235404343842
FR 2018-11-10 31170.0 1.6752145446546487
RS 2018-03-31 8229.0 1.670571317391283
RS 2018-03-13 36411.0 1.669208933343948
CA 2018-12-08 11951.0 1.6688752882711313
BE 2018-07-10 12444.0 1.6625638356436823
CA 2018-08-06 61635.0 1.6617297229616403
FR 2018-11-28 6062.0 1.656683341235288
RS 2018-04-04 19788.0 1.6560577567237567
IZ 2018-05-10 1888.0 1.6546397651642857
RS 2018-05-08 15439.0 1.652290347776535
FR 2018-12-07 3993.0 1.6509696693633023
RS 2018-04-06 40806.0 1.6503440848517705
FR 2018-11-13 18592.0 1.6492736402431507
IZ 2018-05-11 12134.0 1.647980765585986
CH 2018-12-03 15953.0 1.644602609223717
RS 2018-05-09 45093.0 1.6387360166933564
IR 2018-05-15 20937.0 1.6358027204281762
BE 2018-07-09 7253.0 1.635121528404509
SA 2018-08-17 6684.0 1.6334811067964934
RS 2018-02-26 14538.0 1.630923161238232
RS 2018-04-03 39879.0 1.6304226936290074
CH 2018-11-30 15171.0 1.6300473429220883
RS 2018-04-02 4053.0 1.6294634640446592
IZ 2018-05-12 88160.0 1.6277535330464736
ID 2018-10-28 186.0 1.627405986095623
CA 2018-08-29 11360.0 1.6258072701217097
RS 2018-05-10 30477.0 1.6227210531981553
CA 2018-12-09 4267.0 1.6219703517843178
CH 2018-12-02 35823.0 1.616910068179931
CA 2018-04-25 17322.0 1.6163122874244678
RS 2018-05-07 16782.0 1.6134067949153559
CH 2018-04-06 15551.0 1.6119609995998165
TU 2018-10-12 12942.0 1.6118219808194765
RS 2018-04-05 8159.0 1.6110851812836728
BE 2018-07-13 2186.0 1.6104456948941075
RS 2018-07-12 39805.0 1.6099313254068484
RS 2018-02-19 9501.0 1.6077070249214038
IR 2018-08-02 13428.0 1.6048293361683599
CA 2018-06-08 25083.0 1.6033001295846165
SY 2018-04-19 11242.0 1.602090666195656
IR 2018-01-05 12007.0 1.5995049168813267
CH 2018-04-01 3450.0 1.5981703365900601
CH 2018-12-04 7477.0 1.5969052656889633
SY 2018-04-10 7496.0 1.5960294473728194
SA 2018-08-19 37751.0 1.5951119234225735
SA 2018-08-18 53071.0 1.5941109882041236
CA 2018-06-11 17851.0 1.5905104017933098
UK 2018-07-09 21538.0 1.5881748862835932
CH 2018-04-02 30679.0 1.5879941618691509
IZ 2018-05-13 12415.0 1.5872573623333472
IR 2018-05-16 31071.0 1.5866178759437817
CA 2018-09-02 7408.0 1.5862564271148971
IZ 2018-09-07 13050.0 1.585658646359434
CA 2018-04-26 18877.0 1.5770116782222678
IZ 2018-05-14 7622.0 1.5745788495663129
RS 2018-04-19 15591.0 1.5735501105917948
RS 2018-02-24 10084.0 1.5718401795936092
CA 2018-08-31 42147.0 1.5701997579855937
FR 2018-11-29 4023.0 1.5689485889625314
RS 2018-03-09 15082.0 1.5658345682829087
UK 2018-04-25 13301.0 1.5613025560438152
IR 2018-05-22 13833.0 1.553600915612963
IR 2018-05-20 11321.0 1.5513210076153825
RS 2018-05-11 11097.0 1.550125446104456
IR 2018-05-17 18735.0 1.5500420348362514
RS 2018-07-11 37935.0 1.5493886465686526
SA 2018-11-17 14016.0 1.5488186695692572
IZ 2018-09-08 48602.0 1.5475814024242285
FR 2018-11-14 7486.0 1.5465248596936425
CH 2018-12-01 23734.0 1.544773223061355
IR 2018-08-29 21165.0 1.5437305822088025
RS 2018-03-28 15810.0 1.5413811648210516
IR 2018-08-30 10478.0 1.5351253197057386
CH 2018-05-20 18729.0 1.5346804596086496
IR 2018-09-06 10386.0 1.5345553427063436
KN 2018-05-06 1524.0 1.5345553427063436
RS 2018-04-07 10073.0 1.5336934262682338
SA 2018-10-29 10656.0 1.5321920234405586
CH 2018-11-29 14561.0 1.529564568492127
RS 2018-02-23 25545.0 1.5293421384435828
RS 2018-02-25 39728.0 1.5282438900788942
RS 2018-03-12 12344.0 1.5271734454702741
CA 2018-04-23 15267.0 1.521821222427173
RS 2018-03-29 15094.0 1.5189852393082308
CA 2018-08-28 16504.0 1.5153290453852815
CA 2018-04-27 32303.0 1.5119230852669443
CA 2018-04-21 11152.0 1.5117979683646379
IR 2018-09-07 10204.0 1.5110472669508004
UK 2018-04-13 11551.0 1.5107692293901198
SY 2018-04-09 36794.0 1.5062233152729922
CA 2018-09-01 11791.0 1.506167707760856
UK 2018-06-15 18644.0 1.5053892025909505
IR 2018-08-28 26057.0 1.504054622299684
RS 2018-03-10 5304.0 1.5039156035193437
SY 2018-12-22 10096.0 1.5002733114744282
UK 2018-04-26 15228.0 1.4996477269628967
IR 2018-10-01 29604.0 1.497479033989588
IR 2018-05-03 10653.0 1.4965337062832742
CA 2018-06-13 8809.0 1.494948892187395
CA 2018-04-22 7783.0 1.4947681677729523
IR 2018-05-21 38955.0 1.4941286813833872
CH 2018-05-19 6233.0 1.4908339362893224
FR 2018-11-09 12853.0 1.4904724874604376
RS 2018-03-11 22748.0 1.4887903602183201
SA 2018-11-25 7373.0 1.488387205755333
UK 2018-07-15 1859.0 1.4880535606825167
IZ 2018-09-06 14122.0 1.485440007612119
IZ 2018-05-09 4305.0 1.485203675685541
FR 2018-12-08 9276.0 1.4834798428093212
AR 2018-11-28 8399.0 1.4811026216655023
CH 2018-05-11 4308.0 1.4802963127395283
RS 2018-07-10 5286.0 1.4784890685951046
CH 2018-05-17 33464.0 1.4749301878183931
RS 2018-03-08 11486.0 1.47333147184448
IR 2018-07-29 5277.0 1.4725807704306422
UK 2018-04-23 27374.0 1.4722193216017576
UK 2018-06-22 26542.0 1.4708847413104909
IR 2018-08-01 19137.0 1.4683545995082972
CA 2018-08-05 3423.0 1.4673953699239495
SY 2018-12-21 51717.0 1.467020019217031
RS 2018-03-27 21497.0 1.466630766632078
RS 2018-02-12 18994.0 1.4644064661466332
IR 2018-01-10 18383.0 1.461473169881453
RS 2018-11-27 12183.0 1.4595825144688253
RS 2018-02-22 11981.0 1.4593183787861788
IR 2018-01-11 12379.0 1.4535490994020566
IR 2018-05-23 14529.0 1.4533127674754784
SY 2018-12-23 2846.0 1.4528679073783894
IR 2018-07-30 20760.0 1.4525620660616405
FR 2018-11-27 7903.0 1.449809494210903
IR 2018-05-02 11207.0 1.4486278345780104
CA 2018-06-07 27936.0 1.4457362439469321
UK 2018-04-12 18253.0 1.4456389308006943
RS 2018-07-24 8910.0 1.4452774819718093
BE 2018-07-14 1617.0 1.4449716406550608
CH 2018-05-18 17601.0 1.440314511513661
RS 2018-07-23 18035.0 1.4389521274663264
IR 2018-07-31 38126.0 1.4362690650057586
IZ 2018-05-15 1718.0 1.4274691762102183
UK 2018-01-14 3048.0 1.425314385114944
RS 2018-02-27 11544.0 1.4217416024601985
CA 2018-10-15 15977.0 1.421588681801824
UK 2018-06-14 13808.0 1.4206433540955101
IR 2018-08-27 41619.0 1.4179602916349427
IR 2018-07-04 19242.0 1.4178768803667385
FR 2018-11-26 7502.0 1.4166257113436758
RS 2018-03-30 40737.0 1.4150269953697623
SA 2018-12-03 8704.0 1.4106757075451113
RS 2018-05-12 3757.0 1.4094940479122189
SN 2018-06-11 38071.0 1.407214139914638
IR 2018-10-31 8788.0 1.4044059605517643
IR 2018-08-26 5418.0 1.4041418248691178
SA 2018-10-08 19878.0 1.4039889042107434
FR 2018-11-22 2050.0 1.4029740671142592
SN 2018-06-10 22899.0 1.3943966083672632
CA 2018-06-12 18601.0 1.3943827064892294
IR 2018-07-28 7907.0 1.3937571219776983
FR 2018-11-23 2565.0 1.3933539675147113
IR 2018-08-25 4286.0 1.3908238257125178
FR 2018-11-24 55411.0 1.3900592224206465
FR 2018-11-25 29589.0 1.3891973059825367
SA 2018-12-04 21457.0 1.3871815336676023
IR 2018-10-02 8783.0 1.3833168115741423
UK 2018-01-15 13569.0 1.3828858533550874
UK 2018-04-14 4696.0 1.3786874861888108
CA 2018-10-16 24171.0 1.3774224152877141
RS 2018-07-25 5419.0 1.3735993988283561
CH 2018-05-07 10919.0 1.3719728790983745
UK 2018-04-24 18583.0 1.3671489274205662
IR 2018-07-05 17456.0 1.3656614264709255
IR 2018-06-07 9686.0 1.3656197208368235
RS 2018-07-09 7454.0 1.363478831619583
IS 2018-05-09 15338.0 1.362922756498222
CA 2018-06-14 18868.0 1.3608930823052536
IR 2018-09-05 8062.0 1.3603648109399606
IS 2018-05-10 72644.0 1.3594333851116807
UK 2018-01-13 12563.0 1.3590997400388638
CA 2018-09-03 2884.0 1.3566530095048748
RS 2018-02-20 8487.0 1.3551933123113018
SA 2018-12-02 12480.0 1.3533304606547418
IS 2018-05-08 6882.0 1.3532470493865376
CA 2018-12-10 13139.0 1.3519124690952709
CA 2018-10-17 33541.0 1.3517178428027943
CA 2018-10-14 10781.0 1.3498688930242686
IR 2018-05-01 28455.0 1.3492989160248734
IR 2018-07-27 11005.0 1.349145995366499
CH 2018-05-06 2251.0 1.3471302230515647
AR 2018-12-04 2242.0 1.3466019516862722
SY 2018-04-08 16020.0 1.342459192032131
IR 2018-05-24 14621.0 1.3422089582275187
CA 2018-12-03 16961.0 1.3412775323992387
CA 2018-01-18 13738.0 1.3391088394259298
CH 2018-07-08 2205.0 1.3379688854271399
IR 2018-05-04 19262.0 1.3367872257942472
UK 2018-04-10 25664.0 1.3359392112341717
IR 2018-01-06 6834.0 1.3340485558215438
CA 2018-06-17 10593.0 1.3338400276510332
IR 2018-06-08 6583.0 1.3332144431395019
IR 2018-07-03 21279.0 1.33217180228695
UK 2018-05-07 5301.0 1.330003109313641
VE 2018-05-25 1774.0 1.3277788088281965
FR 2018-05-11 12034.0 1.326569345439236
SN 2018-06-12 31123.0 1.3251374520017312
FR 2018-11-08 2248.0 1.3230799740526948
CH 2018-05-21 22177.0 1.32178709939553
CA 2018-04-20 19129.0 1.3207583604210118
TU 2018-10-27 4528.0 1.3195349951540176
IR 2018-07-06 15913.0 1.3192152519592348
IR 2018-08-11 5996.0 1.3185618636916352
UK 2018-04-09 18497.0 1.3161985444258504
IZ 2018-09-09 15579.0 1.3148361603785157
CH 2018-05-16 8081.0 1.3134876782092146
RS 2018-12-07 14087.0 1.312625761771105
IR 2018-01-12 19851.0 1.311833354723165
CA 2018-01-19 18253.0 1.3110548495532597
IR 2018-06-05 31939.0 1.3101234237249797
IS 2018-05-07 2329.0 1.3096090542377208
UK 2018-04-11 18242.0 1.3050214344864912
CH 2018-12-05 14743.0 1.3037146579512924
RS 2018-12-08 19621.0 1.302046432587209
VE 2018-05-26 14188.0 1.3010315954907252
IZ 2018-09-05 18357.0 1.301017693612691
RS 2018-02-28 7664.0 1.3009898898566228
UK 2018-05-21 11404.0 1.2999750527601388
SN 2018-06-09 2957.0 1.2989602156636546
CA 2018-08-04 3257.0 1.2985570612006678
RS 2018-12-04 14318.0 1.2980843973475107
CH 2018-07-09 6810.0 1.2980009860793065
SY 2018-12-20 30894.0 1.2926487630362058
RS 2018-07-26 13216.0 1.2913975940131428
SN 2018-06-13 7445.0 1.2907442057455434
CH 2018-05-10 19231.0 1.2891732935276983
ID 2018-10-27 53.0 1.2841130099233116
IR 2018-06-06 42002.0 1.2839878930210051
CH 2018-05-15 8474.0 1.2837932667285288
UK 2018-04-27 13479.0 1.2823474714129899
CA 2018-04-19 16931.0 1.281388241828642
SA 2018-12-01 18669.0 1.277050855882025
UK 2018-04-08 8124.0 1.2756606680786218
UK 2018-06-13 13348.0 1.2738534239341983
RS 2018-02-11 30582.0 1.273797816422062
RS 2018-12-06 13831.0 1.2664576248200947
RS 2018-12-05 27060.0 1.2662769004056522
IR 2018-06-04 8450.0 1.266054470357108
VE 2018-05-24 2234.0 1.265595708381985
IR 2018-01-13 31617.0 1.2649284182363516
CA 2018-01-17 22089.0 1.2645113618953305
UK 2018-07-08 3193.0 1.2640525999202077
CA 2018-04-18 18547.0 1.258450143072494
CA 2018-05-31 17206.0 1.257351894707806
UK 2018-03-19 10345.0 1.2568931327326829
CA 2018-05-30 16372.0 1.2558504918801305
CH 2018-06-18 11364.0 1.2555446505633818
CA 2018-06-18 8726.0 1.2549190660518508
SY 2018-12-19 7622.0 1.2543768928085235
SA 2018-08-15 3313.0 1.2514714002994116
UK 2018-04-28 27937.0 1.2506233857393358
CA 2018-10-18 13234.0 1.2487744359608102
CA 2018-04-17 17350.0 1.245396279598541
RS 2018-05-06 6387.0 1.2431580772350623
CA 2018-06-01 18011.0 1.2414203424808083
CH 2018-05-22 12108.0 1.2403498978721885
UK 2018-04-29 3125.0 1.2380421861185393
CA 2018-07-25 18416.0 1.237277582826668
UK 2018-03-04 5144.0 1.236777115217443
CA 2018-07-24 18424.0 1.2365407832908646
CH 2018-08-05 3110.0 1.235915198779333
FR 2018-05-10 3506.0 1.2357205724868567
IR 2018-11-09 8241.0 1.2352062029995976
KN 2018-06-08 3686.0 1.2351227917313936
CA 2018-04-28 4775.0 1.2350810860972914
KN 2018-05-27 4184.0 1.2345806184880666
CH 2018-05-08 34641.0 1.2341496602690116
CA 2018-06-15 10975.0 1.2340384452447393
UK 2018-04-30 11743.0 1.2263090010578193
CA 2018-01-16 25882.0 1.226281197301751
CA 2018-05-29 33358.0 1.225669514668254
UK 2018-02-07 18936.0 1.2238205648897278
CH 2018-05-09 23575.0 1.2215406568921474
UK 2018-05-01 16301.0 1.2203868010153227
IR 2018-07-02 16038.0 1.219052220724056
CA 2018-07-26 6261.0 1.2161050225808423
CH 2018-05-23 11235.0 1.2149650685820517
UK 2018-11-22 11056.0 1.2136721939248871
UK 2018-03-18 11517.0 1.212365417389688
UK 2018-04-07 6920.0 1.2121290854631095
UK 2018-03-09 14277.0 1.211920557292599
IR 2018-06-03 2185.0 1.2110586408544894
CA 2018-06-06 12598.0 1.2098352755874946
IR 2018-08-24 4128.0 1.2078751107846968
VE 2018-05-27 59244.0 1.2072078206390633
UK 2018-06-23 12900.0 1.2061234741524092
IR 2018-07-07 14391.0 1.2040520943253388
UK 2018-03-08 16241.0 1.201480246889043
SA 2018-10-07 7012.0 1.2006461342070016
UK 2018-02-04 14635.0 1.1999510403053004
CH 2018-03-24 4221.0 1.199408867061973
UK 2018-02-06 15372.0 1.1989362032088162
VE 2018-05-28 9163.0 1.1985608525018974
IR 2018-10-03 30232.0 1.1971845665765284
UK 2018-03-21 22001.0 1.193597882043749
KN 2018-05-28 18718.0 1.1935005688975107
KN 2018-07-05 10469.0 1.1933198444830684
UK 2018-03-20 23703.0 1.1931947275807622
CH 2018-06-17 3934.0 1.192054773581972
SA 2018-11-30 14191.0 1.1918045397773593
CA 2018-06-19 13254.0 1.1911650533877938
RS 2018-07-29 2042.0 1.191095543997624
CA 2018-07-22 2925.0 1.190692389534637
KN 2018-04-21 24575.0 1.1871057050018574
CA 2018-10-19 8756.0 1.1822539495679811
CH 2018-03-08 8583.0 1.180655233594068
VE 2018-05-29 9601.0 1.1801547659848433
RS 2018-03-07 15959.0 1.1797377096438222
SA 2018-11-29 16006.0 1.1773743903780372
CA 2018-07-23 41474.0 1.172925789407148
RS 2018-03-04 1629.0 1.1712714659210985
UK 2018-02-05 19498.0 1.170048100654104
UK 2018-03-05 10324.0 1.1689359504113814
UK 2018-03-10 6580.0 1.167740388900455
SY 2018-12-24 13111.0 1.1651546395861259
CA 2018-08-03 6032.0 1.1647653870011732
SA 2018-12-05 16995.0 1.1642510175139138
UK 2018-07-16 10575.0 1.1642232137578457
IR 2018-12-04 11543.0 1.1634725123440082
IR 2018-10-30 4973.0 1.1627079090521366
CH 2018-03-09 22473.0 1.1626383996619665
SA 2018-08-07 12130.0 1.1618459926140268
CA 2018-06-16 23349.0 1.1597051033967862
RS 2018-07-30 3423.0 1.1596633977626845
UK 2018-06-12 15900.0 1.157522508545444
UK 2018-11-21 20627.0 1.1571749615945932
RS 2018-07-08 1798.0 1.1559376944495647
CA 2018-04-15 6133.0 1.1524483230630231
CA 2018-04-16 12435.0 1.152378813672853
CA 2018-05-27 3159.0 1.150891312723212
SY 2018-04-07 1105.0 1.150488158260225
IR 2018-10-04 13580.0 1.1499459850168983
SA 2018-11-15 14772.0 1.1488199328961417
SA 2018-11-26 7666.0 1.1483889746770866
FR 2018-11-21 6124.0 1.146998786873684
UK 2018-03-07 20636.0 1.1469431793615477
RS 2018-03-24 2852.0 1.1455112859240426
IR 2018-07-08 5070.0 1.1437040417796192
IR 2018-04-30 31598.0 1.142703106561169
UK 2018-05-22 11329.0 1.1408958624167451
UK 2018-03-23 8327.0 1.138935697613947
UK 2018-11-24 9398.0 1.1382684074683136
UK 2018-01-12 31543.0 1.1382684074683136
IR 2018-07-01 8826.0 1.137517706054476
FR 2018-12-09 16079.0 1.1353768168372356
SA 2018-11-16 9813.0 1.1333888482783696
SA 2018-08-09 11301.0 1.133124712595723
UK 2018-03-06 20454.0 1.132165483011375
UK 2018-02-03 4853.0 1.1320542679871026
CA 2018-12-02 5790.0 1.1320125623530006
IZ 2018-09-10 8923.0 1.131748426670354
RS 2018-04-20 18588.0 1.1270217881387843
SA 2018-12-06 20131.0 1.1267020449440017
RS 2018-03-06 40865.0 1.125436974042905
SY 2018-12-18 394.0 1.1228234209725076
RS 2018-05-23 12048.0 1.1220171120465339
CA 2018-03-08 13220.0 1.1185277406599927
IR 2018-01-09 6870.0 1.116942926564113
CH 2018-05-12 4097.0 1.1165258702230925
CH 2018-06-19 20359.0 1.1149688598832812
UK 2018-03-22 13776.0 1.1141486490792734
CH 2018-08-04 6976.0 1.113940120908763
RS 2018-05-22 5681.0 1.1139262190307289
IR 2018-09-12 15022.0 1.113523064567742
SA 2018-08-08 11386.0 1.1131477138608235
CA 2018-04-12 16936.0 1.1124387180810877
KN 2018-07-04 1450.0 1.1116185072770801
CA 2018-04-14 10213.0 1.1106453758146981
RS 2018-02-10 6376.0 1.1104646514002556
CA 2018-01-15 7031.0 1.1093107955234314
UK 2018-03-17 7314.0 1.108170841524641
UK 2018-11-03 6218.0 1.1078232945737903
SA 2018-12-07 12501.0 1.10675284996517
SA 2018-11-28 11508.0 1.1063774992582514
CH 2018-05-05 5004.0 1.1054738771860393
CA 2018-04-29 4956.0 1.1048343907964742
SY 2018-04-06 5722.0 1.1040280818705004
UK 2018-11-05 18989.0 1.1017203701168514
IR 2018-04-28 1929.0 1.100497004849857
CA 2018-03-09 19018.0 1.1002189672891765
CA 2018-08-10 18390.0 1.10009385038687
IS 2018-04-28 5808.0 1.0996628921678153
CA 2018-04-13 12438.0 1.0992597377048283
CH 2018-05-14 24667.0 1.0989956020221818
IR 2018-08-20 24475.0 1.0989260926320117
TU 2018-10-28 5039.0 1.0980641761939018
CH 2018-08-06 8152.0 1.0949084498801773
UK 2018-11-02 20256.0 1.0912661578352618
CA 2018-05-26 3550.0 1.089931577543995
IR 2018-04-29 5108.0 1.088555291618626
RS 2018-07-27 52687.0 1.0882633521799117
UK 2018-03-03 5517.0 1.088013118375299
CH 2018-03-10 17365.0 1.0858861310360923
UK 2018-05-06 2602.0 1.083759143696886
CH 2018-11-28 9554.0 1.083689634306716
CA 2018-05-28 7663.0 1.0833698911119334
CH 2018-03-11 12040.0 1.0828972272587762
UK 2018-03-11 9067.0 1.0825218765518576
SA 2018-08-06 45949.0 1.0818128807721221
RS 2018-05-24 15901.0 1.0814514319432373
CA 2018-10-01 18141.0 1.0810621793582844
IR 2018-12-03 4767.0 1.079894421603426
UK 2018-11-23 20375.0 1.0787683694826697
UK 2018-11-20 13630.0 1.0787127619705337
CH 2018-03-07 8085.0 1.078448626287887
UK 2018-01-11 14748.0 1.077461592947471
KN 2018-04-20 2020.0 1.0753346056082647
KN 2018-06-15 5383.0 1.074556100438359
UK 2018-01-16 22671.0 1.0738749084146915
UK 2018-05-24 12797.0 1.072582033757527
UK 2018-01-10 10775.0 1.0692594849073938
KN 2018-05-30 17110.0 1.0689397417126112
CA 2018-12-01 5193.0 1.0684670778594543
IR 2018-08-31 10321.0 1.067938806494161
UK 2018-05-23 15806.0 1.067883198982025
SA 2018-10-30 12121.0 1.0674939463970723
IR 2018-08-19 8024.0 1.0664791093005879
UK 2018-02-28 9935.0 1.0662010717399075
CA 2018-06-05 6664.0 1.0653391553017977
RS 2018-12-09 5704.0 1.0636014205475441
UK 2018-02-08 11861.0 1.0634624017672039
IS 2018-04-29 7653.0 1.0631843642065235
UK 2018-11-04 8334.0 1.0628090134996044
UK 2018-06-24 4821.0 1.062127821475937
RS 2018-03-05 5690.0 1.0617107651349165
UK 2018-06-09 9859.0 1.0594169552593014
RS 2018-03-26 11531.0 1.0571787528958227
KN 2018-05-31 17935.0 1.0562612289455768
RS 2018-03-03 1398.0 1.0559553876288281
IR 2018-01-14 9977.0 1.0552185880930245
RS 2018-07-28 3738.0 1.0551629805808886
RS 2018-03-23 11284.0 1.0542871622647445
CA 2018-03-07 11165.0 1.0538423021676557
CA 2018-04-11 15540.0 1.0521462730475042
CH 2018-07-07 4860.0 1.05106192656085
IR 2018-07-09 12697.0 1.0509368096585436
IS 2018-05-18 1299.0 1.0502000101227402
UK 2018-07-07 2808.0 1.0479340040031935
IR 2018-08-21 12897.0 1.0477949852228532
UK 2018-06-10 5382.0 1.047252811979526
UK 2018-06-25 8823.0 1.0470998913211518
UK 2018-02-02 10163.0 1.0469191669067093
CA 2018-01-14 3228.0 1.043791244349053
IR 2018-12-09 7075.0 1.0416642570098464
TU 2018-10-11 8612.0 1.039439956524402
IR 2018-01-07 10547.0 1.0376327123799778
CH 2018-04-08 5443.0 1.0370071278684467
RS 2018-05-27 2486.0 1.0346299067246276
CA 2018-09-30 11807.0 1.0340738316032667
KN 2018-03-05 25052.0 1.032919975726442
CA 2018-10-03 15137.0 1.0325585268975574
CA 2018-11-24 4068.0 1.031404671020733
UK 2018-11-18 793.0 1.0312239466062907
UK 2018-02-27 9407.0 1.0306261658508273
SA 2018-08-22 3385.0 1.0302786188999769
UK 2018-12-07 12547.0 1.0296113287543434
CA 2018-12-20 13876.0 1.029569623120241
UK 2018-03-15 10573.0 1.0289718423647778
IS 2018-04-27 13252.0 1.0287633141942676
UK 2018-06-11 17476.0 1.028513080389655
RS 2018-03-25 9131.0 1.0277762808538515
IR 2018-12-06 39453.0 1.0272480094885583
UK 2018-03-24 7343.0 1.026580719342925
UK 2018-12-04 21935.0 1.0248707883447394
KN 2018-05-26 586.0 1.0245371432719228
CA 2018-10-02 13771.0 1.023272072370826
UK 2018-04-06 10131.0 1.0230079366881795
CH 2018-06-20 27492.0 1.0222989409084442
CH 2018-08-07 26144.0 1.0216872582749468
IR 2018-09-14 8471.0 1.0200607385449654
UK 2018-03-16 12909.0 1.0188234713999367
CA 2018-07-21 2059.0 1.0183647094248138
IR 2018-06-30 8226.0 1.0180032605959293
CA 2018-01-20 11892.0 1.0175305967427721
FR 2018-11-15 5481.0 1.017447185474568
UK 2018-03-01 21121.0 1.0170162272555132
SA 2018-12-14 28075.0 1.0167242878167986
CH 2018-03-12 13670.0 1.0157928619885186
UK 2018-12-06 17595.0 1.013638070893244
KN 2018-05-29 23746.0 1.012706645064964
CH 2018-05-24 18150.0 1.0121922755777049
CA 2018-11-30 11760.0 1.0115944948222417
UK 2018-12-05 15555.0 1.0114137704077992
IR 2018-12-07 9526.0 1.0110940272130167
ID 2018-10-26 257.0 1.010273816409009
IR 2018-12-05 5942.0 1.0101208957506347
BE 2018-07-08 35.0 1.0099123675801243
CA 2018-08-27 10890.0 1.0091894699223547
IR 2018-01-08 16634.0 1.00789659526519
RS 2018-12-10 10953.0 1.0059781360964941
IR 2018-08-22 20207.0 1.0057696079259837
CA 2018-09-07 15692.0 1.0054359628531668
CA 2018-10-13 3775.0 1.0042264994642063
UK 2018-03-12 11075.0 1.0037260318549814
UK 2018-11-25 4857.0 1.0018631801984212
CA 2018-10-20 1926.0 0.99837380881188
IZ 2018-09-11 3842.0 0.998332103177778
CA 2018-11-23 12671.0 0.9979428505928253
RS 2018-08-04 19353.0 0.9974701867396681
FR 2018-05-12 2803.0 0.9967055834477968
RS 2018-08-06 10523.0 0.9958297651316528
RS 2018-08-07 12989.0 0.9950234562056791
UK 2018-03-14 14506.0 0.9949122411814069
RS 2018-02-21 10800.0 0.9927713519641665
UK 2018-11-06 14804.0 0.9917009073555462
RS 2018-02-09 6898.0 0.9908111871613686
CH 2018-05-27 4143.0 0.9893375880897615
UK 2018-05-25 26055.0 0.9891012561631829
CA 2018-09-04 11898.0 0.9888788261146384
RS 2018-05-26 8213.0 0.9878083815060181
YM 2018-06-16 3404.0 0.9865711143609895
UK 2018-03-13 17276.0 0.9847082627044296
SA 2018-11-27 15325.0 0.9841938932171707
VE 2018-05-30 1787.0 0.9838880519004221
CA 2018-03-10 22253.0 0.9825117659750531
UK 2018-05-26 4386.0 0.9822615321704407
KN 2018-04-24 8835.0 0.9822476302924067
CA 2018-11-21 26536.0 0.9821364152681343
IS 2018-02-12 4864.0 0.9818166720733518
CH 2018-06-21 9251.0 0.9806350124404594
IR 2018-09-13 9166.0 0.9796896847341452
UK 2018-02-26 16264.0 0.9794116471734647
CA 2018-11-22 15732.0 0.9789945908324437
UK 2018-05-05 3168.0 0.9787999645399674
CA 2018-04-10 16531.0 0.9784246138330486
SZ 2018-01-24 3483.0 0.9778268330775853
KN 2018-08-05 9905.0 0.9768258978591353
UK 2018-11-26 8809.0 0.9760334908111956
SZ 2018-01-25 20766.0 0.9751298687389837
KN 2018-08-04 39472.0 0.974768419910099
SA 2018-08-05 1005.0 0.9745181861054866
CA 2018-04-09 13383.0 0.9741150316424995
SN 2018-06-14 2214.0 0.9735172508870363
CA 2018-11-20 9591.0 0.9733921339847299
AR 2018-11-27 7530.0 0.9733643302286622
SA 2018-10-31 20835.0 0.9703754264513458
GM 2018-07-09 4080.0 0.969944468232291
IR 2018-08-23 9484.0 0.9695413137693042
UK 2018-11-01 12158.0 0.969026944282045
CH 2018-06-22 10153.0 0.9681094203317991
SA 2018-12-15 1776.0 0.9679147940393228
SA 2018-10-06 3236.0 0.9679008921612888
CA 2018-06-04 6517.0 0.9670250738451449
CA 2018-12-18 16498.0 0.9667192325283961
UK 2018-10-17 12923.0 0.9658990217243885
RS 2018-05-25 31239.0 0.9650510071643127
CH 2018-08-08 18205.0 0.9646478527013258
CA 2018-08-11 5123.0 0.9641334832140667
CH 2018-05-26 4407.0 0.962785001044766
UK 2018-11-07 6845.0 0.9625208653621194
UK 2018-06-08 9364.0 0.960894345632138
CA 2018-12-19 15228.0 0.9603660742668448
CH 2018-04-09 20904.0 0.9601158404622324
SA 2018-12-11 6143.0 0.9600602329500963
UK 2018-12-03 10186.0 0.9597126859992456
UK 2018-10-16 11278.0 0.9595319615848033
TU 2018-10-29 6036.0 0.9578776380987538
CA 2018-12-11 12110.0 0.9578359324646516
CH 2018-08-09 8107.0 0.9572659554652565
UK 2018-05-27 6530.0 0.9572103479531204
KN 2018-08-03 8845.0 0.956974016026542
IR 2018-06-09 2669.0 0.9562650202468067
FR 2018-05-07 1171.0 0.9559035714179218
CA 2018-09-08 6725.0 0.9543465610781103
UK 2018-11-19 18513.0 0.9509684047158414
UK 2018-10-15 18420.0 0.9499396657413233
CH 2018-05-02 6192.0 0.9497589413268809
CA 2018-09-29 4279.0 0.9495643150344044
CA 2018-04-30 10585.0 0.9494391981320983
UK 2018-01-17 15538.0 0.9484660666697161
SA 2018-12-12 8236.0 0.9481463234749336
UK 2018-10-23 11972.0 0.9480768140847634
IS 2018-04-30 39359.0 0.9476041502316066
CH 2018-07-05 14510.0 0.9466171168911907
SA 2018-11-02 7588.0 0.9461305511599993
RS 2018-08-05 3183.0 0.9449488915271068
IR 2018-04-27 4401.0 0.9437533300161804
CH 2018-08-10 13952.0 0.9435448018456699
FR 2018-05-09 26697.0 0.9433501755531933
IR 2018-07-19 6500.0 0.9420016933838927
CH 2018-07-04 5185.0 0.9418904783596206
UK 2018-01-18 11696.0 0.9405002905562176
CH 2018-04-21 2238.0 0.9389710839724744
FR 2018-05-06 5113.0 0.9386235370216236
UK 2018-02-23 13238.0 0.9378311299736841
CH 2018-03-23 13897.0 0.9373306623644589
SZ 2018-01-23 7784.0 0.9368162928771999
CH 2018-06-23 2137.0 0.9366355684627576
UK 2018-10-22 17641.0 0.9349812449767081
UK 2018-03-02 16922.0 0.9344390717333811
SA 2018-08-04 5427.0 0.9341749360507345
IR 2018-12-08 6740.0 0.9341332304166324
UK 2018-10-25 19331.0 0.9331879027103184
RS 2018-05-28 5004.0 0.9309079947127376
SA 2018-11-14 5026.0 0.9275993477406388
CH 2018-05-03 17444.0 0.9274881327163664
UK 2018-01-09 6786.0 0.9263898843516782
CH 2018-05-01 8212.0 0.925305537865024
IR 2018-05-25 17008.0 0.9251804209627178
CA 2018-03-11 6414.0 0.9250553040604114
CA 2018-10-04 12889.0 0.9237207237691446
UK 2018-10-18 14860.0 0.9212461894790874
CH 2018-07-06 37715.0 0.9211627782108834
IR 2018-07-10 12995.0 0.9211210725767813
FR 2018-05-08 27687.0 0.920995955674475
UK 2018-10-20 6742.0 0.920064529846195
UK 2018-02-09 12241.0 0.9199950204560251
UK 2018-11-08 14957.0 0.9196613753832082
CA 2018-11-19 7151.0 0.9192860246762893
RS 2018-12-11 13557.0 0.919077496505779
IR 2018-01-15 6440.0 0.9189940852375748
IS 2018-02-13 22920.0 0.9187438514329623
CA 2018-09-05 17121.0 0.9184519119942476
KN 2018-03-04 152.0 0.9181877763116011
GM 2018-07-10 5768.0 0.9180070518971587
CA 2018-06-20 21335.0 0.9177568180925463
FR 2018-11-16 2356.0 0.9162832190209391
CA 2018-05-14 9529.0 0.9159495739481225
IS 2018-02-11 14216.0 0.9158105551677822
UK 2018-10-24 9525.0 0.9137252734626777
RS 2018-08-21 22844.0 0.9120292443425262
UK 2018-10-31 10083.0 0.9107780753194635
UK 2018-10-19 10454.0 0.910416626490579
CA 2018-09-06 15492.0 0.9079003865664196
RS 2018-08-03 10704.0 0.907664054639841
UK 2018-02-25 8682.0 0.9066631194213911
IS 2018-02-10 12914.0 0.9064406893728467
SA 2018-12-16 1953.0 0.9035073931076667
CH 2018-06-16 8803.0 0.9031320424007477
UK 2018-05-28 5399.0 0.9026871823036586
SA 2018-11-03 4098.0 0.9024647522551144
CH 2018-05-13 7116.0 0.9011301719638475
UK 2018-12-08 3237.0 0.900866036281201
CH 2018-04-10 18509.0 0.9007548212569289
IZ 2018-07-16 11945.0 0.9002404517696698
SY 2018-04-20 1426.0 0.8996287691361725
UK 2018-02-24 2958.0 0.899475848477798
KN 2018-07-06 23753.0 0.8987112451859265
CH 2018-07-03 6130.0 0.8980439550402932
CH 2018-03-06 12569.0 0.8966398653588562
CA 2018-11-28 15128.0 0.894665798678024
CA 2018-11-29 17090.0 0.894165331068799
SA 2018-12-08 2633.0 0.8938594897520504
CH 2018-04-11 7569.0 0.8935675503133359
CA 2018-09-28 12893.0 0.8933451202647913
CA 2018-11-26 8514.0 0.891690796778742
RS 2018-05-21 13377.0 0.8901337864389307
CA 2018-11-18 6052.0 0.8896611225857737
CA 2018-01-13 9142.0 0.8895360056834675
IR 2018-05-26 5427.0 0.8894386925372293
IR 2018-11-19 18393.0 0.8889660286840723
GM 2018-07-11 47291.0 0.8847259558836935
CH 2018-11-11 8793.0 0.884044763860026
UK 2018-10-30 13613.0 0.8837111187872092
KN 2018-08-02 10142.0 0.8836416093970392
CA 2018-03-12 3857.0 0.8829882211294399
CH 2018-07-10 15474.0 0.8818621690086835
CA 2018-12-21 17800.0 0.8804997849613485
CH 2018-04-12 9543.0 0.8804441774492126
CA 2018-05-17 24157.0 0.8800132192301576
CA 2018-11-27 11998.0 0.8790261858897415
CA 2018-05-16 18050.0 0.8784701107683803
UK 2018-08-31 12558.0 0.8784145032562442
GM 2018-07-12 14380.0 0.8774552736718962
KN 2018-04-23 16828.0 0.8772328436233519
SN 2018-06-08 7669.0 0.8772189417453179
IR 2018-08-18 7309.0 0.8770104135748072
UK 2018-05-02 12396.0 0.8760511839904593
UK 2018-10-14 5956.0 0.8753977957228598
SA 2018-11-01 15046.0 0.8730483783351091
UK 2018-10-21 2744.0 0.8728259482865646
CA 2018-11-25 5826.0 0.8728120464085306
RS 2018-02-01 16355.0 0.8723393825553735
RS 2018-08-20 7343.0 0.871588681141536
CH 2018-11-07 6557.0 0.871546975507434
RS 2018-08-22 20034.0 0.8708935872398345
GM 2018-07-13 2052.0 0.8704765308988138
CA 2018-06-02 5721.0 0.8703236102404394
CA 2018-02-02 17797.0 0.8699065538994184
CH 2018-09-15 3144.0 0.8698648482653165
UK 2018-02-01 14012.0 0.8693782825341254
RS 2018-05-13 5235.0 0.8691419506075467
UK 2018-05-03 14384.0 0.8687805017786622
CH 2018-09-21 4692.0 0.8683356416815732
UK 2018-09-03 13948.0 0.8679046834625181
CA 2018-05-15 5417.0 0.8673903139752592
CA 2018-06-21 13191.0 0.867320804585089
CH 2018-09-25 15812.0 0.8671678839267148
RS 2018-02-02 21319.0 0.8667925332197959
UK 2018-03-26 12128.0 0.8654301491724612
IZ 2018-07-15 15445.0 0.8652633266360529
CA 2018-02-03 4060.0 0.8645960364904194
CH 2018-05-25 11743.0 0.8637341200523094
UK 2018-12-09 3012.0 0.8630390261506082
CH 2018-11-08 12463.0 0.8622327172246346
UK 2018-10-28 4144.0 0.8602586505438025
KN 2018-08-06 5699.0 0.860022318617224
KN 2018-07-03 13087.0 0.8595635566421009
RS 2018-01-31 5469.0 0.8593967341056926
CA 2018-07-20 5966.0 0.8573809617907584
IR 2018-10-05 4126.0 0.8560602833775256
CH 2018-11-12 8098.0 0.8540862166966935
CH 2018-01-14 6669.0 0.8539610997943873
RS 2018-04-21 5324.0 0.8534606321851623
UK 2018-02-10 6643.0 0.8534328284290942
SA 2018-08-10 7530.0 0.8531269871123455
IS 2018-04-23 1538.0 0.8527238326493587
IR 2018-11-18 4890.0 0.8525014026008143
UK 2018-10-27 4271.0 0.8525014026008141
CA 2018-10-12 11933.0 0.8521677575279974
CA 2018-12-17 11961.0 0.851945327479453
UK 2018-09-24 12292.0 0.8518758180892829
CH 2018-11-06 12781.0 0.8512919392118538
IR 2018-11-20 8532.0 0.8512224298216837
CA 2018-12-15 2845.0 0.8510834110413432
KN 2018-04-19 7284.0 0.8501797889691315
UK 2018-09-23 3256.0 0.8496515176038383
UK 2018-03-25 5802.0 0.849053736848375
CH 2018-12-06 16553.0 0.8489981293362391
UK 2018-11-12 13789.0 0.8487756992876945
CH 2018-09-26 11491.0 0.8485532692391501
CA 2018-09-09 6649.0 0.8482057222882992
UK 2018-11-09 12342.0 0.8473716096062576
CH 2018-03-05 14966.0 0.8472881983380534
AR 2018-12-05 2588.0 0.8468850438750665
KN 2018-05-25 14726.0 0.8468433382409645
UK 2018-01-19 9315.0 0.846106538705161
IZ 2018-07-17 21518.0 0.8458145992664464
CA 2018-09-18 16734.0 0.844716350901758
CH 2018-11-09 16631.0 0.844716350901758
SY 2018-02-23 5520.0 0.8444383133410774
UK 2018-10-13 6284.0 0.844257588926635
UK 2018-02-22 15888.0 0.8436598081711718
UK 2018-12-02 2852.0 0.8429786161475044
UK 2018-02-20 12166.0 0.8428395973671641
UK 2018-03-27 15621.0 0.8426171673186197
CH 2018-11-18 6278.0 0.8422696203677689
CH 2018-08-03 19219.0 0.8422418166117007
UK 2018-10-12 9764.0 0.8417830546365778
RS 2018-08-23 11478.0 0.8413381945394889
UK 2018-09-01 2609.0 0.8406152968817194
CA 2018-09-19 12751.0 0.8402260442967666
UK 2018-06-07 15156.0 0.8394753428829291
SA 2018-12-13 18078.0 0.839350225980623

(Plot: oil and gas media coverage by country, excluding the USA.)

Investigating a seemingly influential event (2018-10-18) in Saudi Arabia (SA) using the scalable web scraper library goose

//Initializing the web scraper.
val urlContentFetcher = {new ContentFetcher()
    .setInputCol("sourceUrl")
    .setOutputTitleCol("title")
    .setOutputContentCol("content")
    .setOutputKeywordsCol("keywords")
    .setOutputPublishDateCol("publishDateCollected")
    .setOutputDescriptionCol("description")
    .setUserAgent("Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 Firefox/")
    .setConnectionTimeout(1000)
    .setSocketTimeout(1000)
                        }
val big_event_SA = oilEventCoverageDF.filter($"country" === "SA" && $"eventDay" === "2018-10-18").orderBy(desc("coverage")).limit(100)
val SAEventURLS = urlContentFetcher.transform(big_event_SA.select($"country", $"coverage", $"date", $"sourceUrl", $"eventId")).filter(col("description") =!= "").orderBy(desc("coverage"))
SAEventURLS.printSchema
root
 |-- sourceUrl: string (nullable = true)
 |-- description: string (nullable = false)
 |-- content: string (nullable = false)
 |-- keywords: array (nullable = false)
 |    |-- element: string (containsNull = true)
 |-- title: string (nullable = false)
 |-- publishDateCollected: date (nullable = true)
 |-- country: string (nullable = true)
 |-- coverage: double (nullable = true)
 |-- date: date (nullable = true)
 |-- eventId: integer (nullable = true)
SAEventURLS.select($"description").show(6,false)
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|description                                                                                                                                                                                           |
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|The old Saudi Arabia was a place the United States often turned to in times of turbulence — when world oil prices were spiking or political tensions in the                                           |
|APPS-KHASHOGGI/ (COLUMN):COLUMN-Khashoggi case shows America's collapsing Mideast clout                                                                                                               |
|The disappearance of Jamal Khashoggi reveals the true nature of the Saudi crown prince, says author Dilip Hiro                                                                                        |
|Everything you wanted to know about U.S.-Saudi arms sales and the politics around them, but were afraid to ask.                                                                                       |
|The money transfer curiously coincided with the visit of US Secretary of State Mike Pompeo to Riyadh to sort out frictions between the allies over the Khashoggi killing in Turkey earlier this month.|
|If you had to pick a year in the past decade when the contradictions of the American-Saudi relationship seemed likeliest to explode into crisis, 2018 would not be the obvious choice.                |
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
only showing top 6 rows
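What goose's ContentFetcher extracts per URL (a title and a meta description, among other fields) can be sketched in plain Python with only the standard library. The HTML string and class below are hypothetical, for illustration only:

```python
# Minimal sketch of per-page metadata extraction (title + meta description),
# roughly what the ContentFetcher columns above contain, using only stdlib.
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

html_doc = ('<html><head><title>Khashoggi case</title>'
            '<meta name="description" content="US-Saudi relations strained."/>'
            '</head><body>...</body></html>')
parser = MetaExtractor()
parser.feed(html_doc)
print(parser.title, "|", parser.description)
```

The real library also fetches the page over HTTP, times out slow connections (the `setConnectionTimeout`/`setSocketTimeout` settings above), and extracts keywords and publish dates.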

Investigating seemingly influential events in Iran (IR) using goose on:

  • 2018-05-10
  • 2018-09-25

Events on 2018-05-10

val big_event_IR1 = oilEventCoverageDF.filter($"country" === "IR" && $"eventDay" === "2018-05-10").orderBy(desc("coverage")).limit(100)
val IR1EventURLS = urlContentFetcher.transform(big_event_IR1.select($"country", $"coverage", $"date", $"sourceUrl", $"eventId")).filter(col("description") =!= "").orderBy(desc("coverage"))
IR1EventURLS.select($"description").show(10,false)
+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|description                                                                                                                                                                                                                                               |
+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|See related links to what you are looking for.                                                                                                                                                                                                            |
|See related links to what you are looking for.                                                                                                                                                                                                            |
|See related links to what you are looking for.                                                                                                                                                                                                            |
|Iran's Supreme Leader Ayatollah Ali Khamenei said Wednesday that US President Donald Trump's anti-Iran remarks, upon announcement of his withdrawal for the nuclear deal on Tuesday, was                                                                  |
|Before the May 12 deadline, President Donald Trump has withdrawn the US from the multilateral Iran nuclear deal, with the objective to find a different method of dealing with the purported Iranian...                                                   |
|Before the May 12 deadline, President Donald Trump has withdrawn the US from the multilateral Iran nuclear deal, with the objective to find a different method of dealing with the purported Iranian...                                                   |
|It is dangerous that neither Trump nor Netanyahu appears to fully grasp the dire regional and international implications of the unilateral decertification of the deal by the US.                                                                         |
|So, Bibi helps Trump decide on Iran, he threatens to strike Syria if the Russians sell them S-400 and now hes a guest of honor at the Victory Day parade in Moscow. How much weirder can Middle Eastern politics get, I wonder.                           |
|Canadian stocks may continue to rise Thursday, stoked by higher oil prices and a diplomatic breakthrough with North Korea.  Three Americans Americans ere freed Wednesday ahead of President Trump's upcoming summit with North Korean leader Kim Jong Un.|
|For Iran, Iraq is the most important Arab state, even more than Syria and Lebanon                                                                                                                                                                         |
+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
only showing top 10 rows

Events on 2018-09-25

val big_event_IR2 = oilEventCoverageDF.filter($"country" === "IR" && $"eventDay" === "2018-09-25").orderBy(desc("coverage")).limit(100)
val IR2EventURLS = urlContentFetcher.transform(big_event_IR2.select($"country", $"coverage", $"date", $"sourceUrl", $"eventId")).filter(col("description") =!= "").orderBy(desc("coverage"))
IR2EventURLS.select($"description").show(6,false)
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|description                                                                                                                                                                                                                                                                                                                                    |
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|Following the deadly attack in the Iranian city of Ahvaz on Saturday morning, which killed 25 and wounded 53 during a parade commemorating the 38th anniversary of the...                                                                                                                                                                      |
|Following the deadly attack in the Iranian city of Ahvaz on Saturday morning, which killed 25 and wounded 53 during a parade commemorating the 38th anniversary of the...                                                                                                                                                                      |
|Trump said his administration has accomplished more in less than two years 'than almost any administration in the history of our country'                                                                                                                                                                                                      |
|A crowd largely comprised of Iranian-Americans gathered to protest the Iranian regime and President Hassan Rouhani.                                                                                                                                                                                                                            |
|With European companies abandoning Iran in the face of growing U.S. pressure, European politicians backing Iran are counting on oil demand from China, India and Turkey to keep the 2015 nuclear deal alive. Since the U.S. withdrawal from the nuclear deal in May, Iran has threatened to resume its nuclear program if it loses economic ...|
|President Trump won't meet with Iranian President Hassan Rouhani during his time at the United Nations General Assembly meeting and blasted the country in his speech to delegates.                                                                                                                                                            |
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
only showing top 6 rows

Enrich Data with Trend Calculus

For more information about Trend Calculus, see https://lamastex.github.io/spark-trend-calculus-examples/

val rootTrendPath = TrendUtils.getStreamableTrendCalculusPath
val oilGoldPath = rootTrendPath + "oilGoldDelta"

Gather trend reversals of all oil price data

val oil_data_all = spark.read.format("delta").load(oilGoldPath).as[TickerPoint].filter($"ticker" === "BCOUSD")
val trend_oil_all = new TrendCalculus2(oil_data_all, 2, spark).nReversalsJoinedWithMaxRev(15)
// assuming GdeltUtils also exposes getEOICheckpointPath on the Scala side
// (it is read from Python further below)
val rootCheckpointPath = GdeltUtils.getEOICheckpointPath
trend_oil_all.write.parquet(rootCheckpointPath + "trend_oil_all")
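Trend Calculus itself is implemented in the spark-trend-calculus library linked above. As a toy illustration of what a first-order trend reversal is, here is a pure-Python sketch (not the library's algorithm) that marks the indices where the sign of the price change flips:

```python
# Naive first-order trend-reversal sketch: a reversal occurs at a local
# extremum, i.e. where the sign of consecutive price changes flips.
# This is only illustrative; TrendCalculus2 computes multi-scale reversals.
def reversals(prices):
    revs = []
    prev_sign = 0
    for i in range(1, len(prices)):
        d = prices[i] - prices[i - 1]
        sign = (d > 0) - (d < 0)
        if sign != 0 and prev_sign != 0 and sign != prev_sign:
            revs.append(i - 1)  # index of the local max/min
        if sign != 0:
            prev_sign = sign
    return revs

prices = [50.0, 52.0, 55.0, 53.0, 51.0, 54.0, 56.0]
print(reversals(prices))  # -> [2, 4]: local max at 55.0, local min at 51.0
```

Applying the idea recursively to the sequence of reversal points yields higher-order reversals, which is what the `maxRev` column used below encodes.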

Code needed for the interactive plot below

from plotly.offline import plot
from plotly.graph_objs import *
from datetime import *
from pyspark.sql import functions as F
import pyspark.sql.functions
from pyspark.sql.functions import col, avg
rootCheckpointPath = GdeltUtils.getEOICheckpointPath()
trend_oil_all = spark.read.parquet(rootCheckpointPath+"trend_oil_all")

oil_gas_cov_us_2015_2018 = spark.read.parquet(rootCheckpointPath+"oil_gas_cov_norm").select(F.col('date'),F.col('country'),F.col('coverage')).filter(F.col('country') == 'US').drop('country').filter(F.col('date')>'2015-01-01').filter(F.col('date')<'2018-12-31')

trend_oil_2015_2018 = trend_oil_all.filter(F.col('x')>'2015-01-01').filter(F.col('x')<'2018-12-31').orderBy(F.col('x'))

max_price = trend_oil_2015_2018.agg({'y': 'max'}).first()[0]
min_price = trend_oil_2015_2018.agg({'y': 'min'}).first()[0]
trend_oil_2015_2018_2 = trend_oil_2015_2018.withColumn('sy', (F.col('y')-min_price)/(max_price-min_price))

fullTS = trend_oil_2015_2018_2.filter("maxRev > 2").select("x","sy","maxRev").collect()
coverage = oil_gas_cov_us_2015_2018.collect()

TS = [row for row in fullTS]
numReversals = 15
startReversal = 7

allData = {'x': [row['x'] for row in TS], 'y': [row['sy'] for row in TS], 'maxRev': [row['maxRev'] for row in TS]}
allDataCov = {'x': [row['date'] for row in coverage], 'y': [row['coverage'] for row in coverage]}

temp2 = max(allDataCov['y'])-min(allDataCov['y'])
standardCoverage = list(map(lambda x: (x-min(allDataCov['y']))/temp2,allDataCov['y']))

revTS = [row for row in TS if row[2] >= startReversal]
colorList = ['rgba(' + str(tmp) + ',' + str(255-tmp) + ',' + str(255-tmp) + ',1)' for tmp in [int(i*255/(numReversals-startReversal+1)) for i in range(1,numReversals-startReversal+2)]]

def getRevTS(tsWithRevMax, revMax):
  x = [row[0] for row in tsWithRevMax if row[2] >= revMax]
  y = [row[1] for row in tsWithRevMax if row[2] >= revMax]
  return x,y,revMax

reducedData = [getRevTS(revTS, i) for i in range(startReversal, numReversals+1)]

markerPlots = [Scattergl(x=x, y=y, mode='markers', marker=dict(color=colorList[i-startReversal], size=i), name='Reversal ' + str(i)) for (x,y,i) in reducedData]
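The min-max scaling applied above to both the price series (`sy`) and the coverage series, so they can share one axis, can be sketched on its own:

```python
# Min-max scaling to [0, 1], as used above to put oil price and media
# coverage on a common vertical axis for plotting.
def min_max_scale(xs):
    lo, hi = min(xs), max(xs)
    if hi == lo:
        return [0.0 for _ in xs]  # degenerate case: constant series
    return [(x - lo) / (hi - lo) for x in xs]

print(min_max_scale([20.0, 50.0, 80.0]))  # -> [0.0, 0.5, 1.0]
```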

Plot of oil price together with oil and gas coverage for USA

p = plot(
  [Scattergl(x=allData['x'], y=allData['y'], mode='lines', name='Oil Price'), Scattergl(x=allDataCov['x'], y=standardCoverage, mode='lines', name='Oil and gas coverage USA')] + markerPlots
  ,
  output_type='div'
)
displayHTML(p)

(Plot: oil price trend reversals together with oil and gas coverage for the USA.)

ScaDaMaLe Course site and book

Detecting Persons of Interest to OIL/GAS Price Trends

Johannes Graner (LinkedIn), Albert Nilsson (LinkedIn) and Raazesh Sainudiin (LinkedIn)

2020, Uppsala, Sweden

This project was supported by Combient Mix AB through summer internships at:

Combient Competence Centre for Data Engineering Sciences, Department of Mathematics, Uppsala University, Uppsala, Sweden


Here we will build a pipeline to investigate possible persons, organisations and other entities related to oil and gas that are reported in mass media around the world and their possible co-occurrence with certain trends and trend-reversals in oil price.

Steps:

  • Step 0. Setting up and loading GDELT delta.io tables
  • Step 1. Create a graph of persons related to gas and oil
  • Step 2. Extract communities
  • Step 3. Find key Influencers
  • Step 4. Visualisation

Resources:

This builds on the following libraries and their antecedents therein:

This work was inspired by:

import spark.implicits._
import io.delta.tables._
import com.aamend.spark.gdelt._
import org.apache.spark.sql.{Dataset,DataFrame,SaveMode}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions._
import org.graphframes.GraphFrame 
import org.apache.spark.sql.SparkSession
"./000b_gdelt_utils"
val rootPath = GdeltUtils.getGdeltV1Path
val rootCheckpointPath = GdeltUtils.getPOICheckpointPath
val gkgPath = rootPath+"gkg"
defined object GdeltUtils
val gkg_v1 = spark.read.format("delta").load(gkgPath).as[GKGEventV1]
val gkg_v1_filt = gkg_v1.filter($"publishDate" > "2013-04-01 00:00:00" && $"publishDate" < "2019-12-31 00:00:00")

val oil_gas_themeGKG = gkg_v1_filt.filter(c => c.themes.contains("ENV_GAS") || c.themes.contains("ENV_OIL"))

The first step is to create a graph of people related to gas and oil, where the weight of an edge is the number of articles in which the two people are mentioned together.

Create the GraphFrame of Interest

val edges = oil_gas_themeGKG.select($"persons",$"numArticles")
                          .withColumn("src",explode($"persons"))
                          .withColumn("dst",explode($"persons"))
                          .filter($"src".notEqual($"dst") && $"src" =!= "" && $"dst" =!= "")
                          .groupBy($"src",$"dst")
                          .agg(sum("numArticles").as("count"))
                          .toDF()

val vertices = oil_gas_themeGKG.select($"persons",$"numArticles")
                          .withColumn("id",explode($"persons"))
                          .filter($"id" =!= "")
                          .drop($"persons")
                          .groupBy($"id")
                          .agg(sum("numArticles").as("numArticles"))
                          .toDF()
val pers_graph = GraphFrame(vertices,edges)
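The edge construction above (a double `explode` over `persons`, grouped with a sum of `numArticles`) amounts to weighted co-occurrence counting. It can be sketched in plain Python on hypothetical rows:

```python
# Sketch of the co-occurrence edge construction: for each article row, every
# ordered pair of distinct person names accumulates the row's numArticles
# weight. The (persons, numArticles) rows here are hypothetical.
from collections import Counter
from itertools import permutations

articles = [
    (["ali khamenei", "donald trump"], 3),
    (["donald trump", "hassan rouhani"], 2),
    (["ali khamenei", "donald trump", "hassan rouhani"], 1),
]

edges = Counter()
for persons, n in articles:
    for src, dst in permutations(set(persons), 2):
        edges[(src, dst)] += n

print(edges[("ali khamenei", "donald trump")])  # -> 4
```

As in the Spark version, each undirected co-mention appears twice, once per direction, which is what GraphFrame expects for an effectively undirected graph.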

Count how many vertices and edges there are in our graph

println("vertex count: " +pers_graph.vertices.count())
println("edge count: " + pers_graph.edges.count())
vertex count: 3422103
edge count: 133703116
val fil_pers_graph = pers_graph.filterEdges($"count" > 10).dropIsolatedVertices()

Count how many vertices and edges are in our graph after being filtered

println("filtered vertex count: " +fil_pers_graph.vertices.count())
println("filtered edge count: " + fil_pers_graph.edges.count())
filtered vertex count: 214514
filtered edge count: 4525828
sc.setCheckpointDir(rootCheckpointPath)

Compute the connected components

val comp_vertices = fil_pers_graph.connectedComponents.run()
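Connected components on a toy undirected graph can be sketched with union-find; GraphFrames performs the equivalent computation at cluster scale:

```python
# Union-find sketch of connected components on a toy graph with
# hypothetical vertex names.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving keeps trees shallow
        x = parent[x]
    return x

def union(a, b):
    ra, rb = find(a), find(b)
    if ra != rb:
        parent[ra] = rb

edges = [("a", "b"), ("b", "c"), ("d", "e")]
for a, b in edges:
    union(a, b)

components = {}
for v in ["a", "b", "c", "d", "e"]:
    components.setdefault(find(v), []).append(v)
print(sorted(len(c) for c in components.values()))  # -> [2, 3]
```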

Checkpoint

comp_vertices.write.parquet(rootCheckpointPath+"comp_vertices")
val comp_vertices = spark.read.parquet(rootCheckpointPath+"comp_vertices")
val comp_graph = GraphFrame(comp_vertices,fil_pers_graph.edges)

Note that almost all edges and vertices are in the connected component labelled 0, our giant component.

comp_graph.vertices.groupBy($"component").agg(count("component").as("count")).orderBy(desc("count")).show()
+------------+------+
|   component| count|
+------------+------+
|           0|195316|
|  8589935254|    97|
| 34359738385|    23|
|        1084|    22|
|         150|    21|
| 94489280677|    20|
| 94489280553|    19|
| 60129543033|    19|
|395136991484|    19|
|180388626903|    17|
|146028888838|    17|
| 25769804801|    16|
| 17179869819|    16|
|231928234487|    16|
|146028888649|    15|
| 51539608679|    15|
| 51539607916|    15|
| 60129543026|    15|
| 42949673514|    15|
|  8589935379|    15|
+------------+------+
only showing top 20 rows

Filter out the graph to only focus on the giant component

val big_comp_graph = comp_graph.filterVertices($"component" === 0)
big_comp_graph: org.graphframes.GraphFrame = GraphFrame(v:[id: string, numArticles: bigint ... 1 more field], e:[src: string, dst: string ... 1 more field])

Step 2. Extract communities

Next, let us extract communities within the giant component.

There are many algorithms for community structure detection:

We use a simple scalable one via label propagation here.

Apply label propagation to find interesting community structures

val label_vertices = big_comp_graph.labelPropagation.maxIter(10).run()
label_vertices: org.apache.spark.sql.DataFrame = [id: string, numArticles: bigint ... 2 more fields]
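A synchronous variant of label propagation, with a deterministic minimum-label tie-break (the GraphFrames implementation breaks ties differently), can be sketched on two disjoint triangles:

```python
# Synchronous label propagation sketch: each vertex repeatedly adopts the
# most frequent label among its neighbours (ties broken by minimum label).
# On two disjoint triangles, each triangle converges to one community.
from collections import Counter

def label_propagation(adj, iters=10):
    labels = {v: v for v in adj}
    for _ in range(iters):
        new = {}
        for v, nbrs in adj.items():
            counts = Counter(labels[n] for n in nbrs)
            best = max(counts.values())
            new[v] = min(l for l, c in counts.items() if c == best)
        if new == labels:
            break
        labels = new
    return labels

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1],
       3: [4, 5], 4: [3, 5], 5: [3, 4]}
labels = label_propagation(adj)
print(labels)  # two communities: {0, 1, 2} and {3, 4, 5}
```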

Checkpoint

label_vertices.write.parquet(rootCheckpointPath+"label_vertices")
big_comp_graph.edges.write.parquet(rootCheckpointPath+"label_edges")
val label_vertices = spark.read.parquet(rootCheckpointPath+"label_vertices")
val label_edges = spark.read.parquet(rootCheckpointPath+"label_edges")
val label_graph = GraphFrame(label_vertices,label_edges)

Step 3. Find key Influencers

Apply page rank to find the key influencers

val com_rank_graph = label_graph.pageRank.resetProbability(0.15).tol(0.015).run()
com_rank_graph: org.graphframes.GraphFrame = GraphFrame(v:[id: string, numArticles: bigint ... 3 more fields], e:[src: string, dst: string ... 2 more fields])
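The PageRank computation can be sketched as power iteration with the same reset probability of 0.15, on a toy directed graph:

```python
# Power-iteration PageRank sketch with reset (teleport) probability 0.15,
# matching the resetProbability used in the GraphFrames call above.
def pagerank(adj, reset=0.15, iters=50):
    n = len(adj)
    ranks = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        new = {v: reset / n for v in adj}
        for v, outs in adj.items():
            if outs:
                share = ranks[v] * (1 - reset) / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling vertex: spread its rank uniformly
                for w in adj:
                    new[w] += ranks[v] * (1 - reset) / n
        ranks = new
    return ranks

adj = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(adj)
print(max(ranks, key=ranks.get))  # -> 'c' (linked to by both a and b)
```

The GraphFrames version iterates until the rank changes fall below the given tolerance (`tol(0.015)`) rather than for a fixed number of iterations.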

Checkpoint

com_rank_graph.vertices.write.parquet(rootCheckpointPath+"com_rank_vertices")
com_rank_graph.edges.write.parquet(rootCheckpointPath+"com_rank_edges")
val com_rank_vertices = spark.read.parquet(rootCheckpointPath+"com_rank_vertices")
val com_rank_edges = spark.read.parquet(rootCheckpointPath+"com_rank_edges")
val com_rank_graph = GraphFrame(com_rank_vertices,com_rank_edges)
com_rank_graph.vertices.groupBy($"label").agg(count($"label").as("count")).orderBy(desc("count")).show()

Step 4. Visualisation

Look at the top three communities

val toplabel1 = com_rank_graph.filterVertices($"label" === 1520418423783L)
val toplabel2 = com_rank_graph.filterVertices($"label" === 8589934959L)
val toplabel3 = com_rank_graph.filterVertices($"label" === 1580547965452L)

Keep only the top 100 vertices according to PageRank score

val toplabel1Filt = toplabel1.filterVertices($"pagerank" >= 55.47527731815801)

Filter out edges to make the graph more comprehensible

val toplabel1FiltE = toplabel1Filt.filterEdges($"count" > 2000).dropIsolatedVertices()

In the interactive d3 graph below, the size of each circle reflects its PageRank score.

case class Edge(src: String, dst: String, count: Long)

case class Node(name: String,importance: Double)
case class Link(source: Int, target: Int, value: Long)
case class Graph(nodes: Seq[Node], links: Seq[Link])

object graphs {

val sqlContext = SparkSession.builder().getOrCreate().sqlContext
import sqlContext.implicits._
  
def force(vertices: Dataset[Node],clicks: Dataset[Edge], height: Int = 100, width: Int = 960): Unit = {
  val data = clicks.collect()
  val nodes = vertices.collect()
  val links = data.map { t =>
    Link(nodes.indexWhere(_.name == t.src.replaceAll("_", " ")), nodes.indexWhere(_.name == t.dst.replaceAll("_", " ")), t.count / 20 + 1)
  }
  showGraph(height, width, Seq(Graph(nodes, links)).toDF().toJSON.first())
}

/**
 * Displays a force directed graph using d3
 * input: {"nodes": [{"name": "..."}], "links": [{"source": 1, "target": 2, "value": 0}]}
 */
def showGraph(height: Int, width: Int, graph: String): Unit = {

displayHTML(s"""
<style>

.node_circle {
  stroke: #777;
  stroke-width: 1.3px;
}

.node_label {
  pointer-events: none;
}

.link {
  stroke: #777;
  stroke-opacity: .2;
}

.node_count {
  stroke: #777;
  stroke-width: 1.0px;
  fill: #999;
}

text.legend {
  font-family: Verdana;
  font-size: 13px;
  fill: #000;
}

.node text {
  font-family: "Helvetica Neue","Helvetica","Arial",sans-serif;
  /* font size per node is not settable in static CSS; it would have to be
     applied via d3 as a function of d.importance */
  font-weight: 200;
}

</style>

<div id="clicks-graph">
<script src="//d3js.org/d3.v3.min.js"></script>
<script>

var graph = $graph;

var width = $width,
    height = $height;

var color = d3.scale.category20();

var force = d3.layout.force()
    .charge(-200)
    .linkDistance(350)
    .size([width, height]);

var svg = d3.select("#clicks-graph").append("svg")
    .attr("width", width)
    .attr("height", height);
    
force
    .nodes(graph.nodes)
    .links(graph.links)
    .start();

var link = svg.selectAll(".link")
    .data(graph.links)
    .enter().append("line")
    .attr("class", "link")
    .style("stroke-width", function(d) { return Math.sqrt(d.value)/10; });

var node = svg.selectAll(".node")
    .data(graph.nodes)
    .enter().append("g")
    .attr("class", "node")
    .call(force.drag);

node.append("circle")
    .attr("r", function(d) { return Math.sqrt(d.importance); })
    .style("fill", function (d) {
    if (d.name.startsWith("other")) { return color(1); } else { return color(2); };
})

node.append("text")
      .attr("dx", function(d) { return (Math.sqrt(d.importance)*30)/Math.sqrt(1661.1815574713858); })
      .attr("dy", ".35em")
      .text(function(d) { return d.name });
      
// On each tick the force layout generates new coordinates; copy them onto the SVG elements' attributes
force.on("tick", function () {
    link.attr("x1", function (d) {
        return d.source.x;
    })
        .attr("y1", function (d) {
        return d.source.y;
    })
        .attr("x2", function (d) {
        return d.target.x;
    })
        .attr("y2", function (d) {
        return d.target.y;
    });
    d3.selectAll("circle").attr("cx", function (d) {
        return d.x;
    })
        .attr("cy", function (d) {
        return d.y;
    });
    d3.selectAll("text").attr("x", function (d) {
        return d.x;
    })
        .attr("y", function (d) {
        return d.y;
    });
});
</script>
</div>
""")
}
  
  def help() = {
displayHTML("""
<p>
Produces a force-directed graph given a collection of edges of the following form:<br/>
<tt><font color="#a71d5d">case class</font> <font color="#795da3">Edge</font>(<font color="#ed6a43">src</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">dst</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">count</font>: <font color="#a71d5d">Long</font>)</tt>
</p>
<p>Usage:<br/>
<tt><font color="#a71d5d">import</font> <font color="#ed6a43">d3._</font></tt><br/>
<tt><font color="#795da3">graphs.force</font>(<br/>
&nbsp;&nbsp;<font color="#ed6a43">height</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">width</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">clicks</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Edge</font>],<br/>
&nbsp;&nbsp;<font color="#ed6a43">vertices</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Node</font>])</tt>
</p>""")
  }
}
graphs.force(
  height = 800,
  width = 1200,
  clicks = toplabel1FiltE.edges.as[Edge],
  vertices = toplabel1FiltE.vertices.select($"id".as("name"),$"pagerank".as("importance")).as[Node]
  )

usa_cluster
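The `force` helper above matches each edge endpoint to a vertex index with `indexWhere` before serialising the graph to JSON for d3. A plain-Scala sketch of that mapping (no Spark; the sample names are made up for illustration):

```scala
// Minimal sketch of the name-to-index mapping performed by graphs.force.
case class Node(name: String, importance: Double)
case class Edge(src: String, dst: String, count: Long)
case class Link(source: Int, target: Int, value: Long)

val nodes = Array(Node("United States", 10.0), Node("Nigeria", 5.0))
val edges = Seq(Edge("United_States", "Nigeria", 40L))

// Underscores in edge endpoints are replaced by spaces to match vertex names.
// indexWhere returns -1 when no vertex matches, which would break d3's layout,
// so such links are filtered out in this sketch.
val links = edges
  .map { e =>
    Link(
      nodes.indexWhere(_.name == e.src.replaceAll("_", " ")),
      nodes.indexWhere(_.name == e.dst.replaceAll("_", " ")),
      e.count / 20 + 1)
  }
  .filter(l => l.source >= 0 && l.target >= 0)
```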

Keep only the top 100 vertices according to PageRank score

val toplabel2Filt =  toplabel2.filterVertices($"pagerank" >=7.410990956624706)

Filter out low-count edges and drop vertices with few incident edges to make the graph more comprehensible

val toplabel2FiltE = toplabel2Filt.filterEdges($"count">136).dropIsolatedVertices()
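The pruning above uses GraphFrames' `filterVertices`, `filterEdges` and `dropIsolatedVertices` on a distributed graph. A plain-Scala sketch of the same semantics on in-memory collections (thresholds and sample data are illustrative, not from the actual cluster):

```scala
// Sketch: keep high-PageRank vertices, keep high-count edges between
// kept vertices, then drop vertices left without any incident edge.
case class V(id: String, pagerank: Double)
case class E(src: String, dst: String, count: Long)

val vs = Seq(V("a", 9.0), V("b", 8.0), V("c", 1.0))
val es = Seq(E("a", "b", 200L), E("a", "c", 10L))

val keptV = vs.filter(_.pagerank >= 7.4)            // filterVertices
val vIds  = keptV.map(_.id).toSet
val keptE = es.filter(e => e.count > 136 &&         // filterEdges
                           vIds(e.src) && vIds(e.dst))
val touched = keptE.flatMap(e => Seq(e.src, e.dst)).toSet
val finalV  = keptV.filter(v => touched(v.id))      // dropIsolatedVertices
```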

Nigerian cluster

case class Edge(src: String, dst: String, count: Long)

case class Node(name: String,importance: Double)
case class Link(source: Int, target: Int, value: Long)
case class Graph(nodes: Seq[Node], links: Seq[Link])

object graphs {
val sqlContext = SparkSession.builder().getOrCreate().sqlContext
import sqlContext.implicits._
  
def force(vertices: Dataset[Node],clicks: Dataset[Edge], height: Int = 100, width: Int = 960): Unit = {
  val data = clicks.collect()
  val nodes = vertices.collect()
  val links = data.map { t =>
    Link(nodes.indexWhere(_.name == t.src.replaceAll("_", " ")), nodes.indexWhere(_.name == t.dst.replaceAll("_", " ")), t.count / 20 + 1)
  }
  showGraph(height, width, Seq(Graph(nodes, links)).toDF().toJSON.first())
}

/**
 * Displays a force directed graph using d3
 * input: {"nodes": [{"name": "..."}], "links": [{"source": 1, "target": 2, "value": 0}]}
 */
def showGraph(height: Int, width: Int, graph: String): Unit = {

displayHTML(s"""
<style>

.node_circle {
  stroke: #777;
  stroke-width: 1.3px;
}

.node_label {
  pointer-events: none;
}

.link {
  stroke: #777;
  stroke-opacity: .2;
}

.node_count {
  stroke: #777;
  stroke-width: 1.0px;
  fill: #999;
}

text.legend {
  font-family: Verdana;
  font-size: 13px;
  fill: #000;
}

.node text {
  font-family: "Helvetica Neue","Helvetica","Arial",sans-serif;
  /* a CSS declaration cannot evaluate JavaScript; use a static size */
  font-size: 13px;
  font-weight: 200;
}

</style>

<div id="clicks-graph">
<script src="//d3js.org/d3.v3.min.js"></script>
<script>

var graph = $graph;

var width = $width,
    height = $height;

var color = d3.scale.category20();

var force = d3.layout.force()
    .charge(-200)
    .linkDistance(350)
    .size([width, height]);

var svg = d3.select("#clicks-graph").append("svg")
    .attr("width", width)
    .attr("height", height);
    
force
    .nodes(graph.nodes)
    .links(graph.links)
    .start();

var link = svg.selectAll(".link")
    .data(graph.links)
    .enter().append("line")
    .attr("class", "link")
    .style("stroke-width", function(d) { return Math.sqrt(d.value)/10; });

var node = svg.selectAll(".node")
    .data(graph.nodes)
    .enter().append("g")
    .attr("class", "node")
    .call(force.drag);

node.append("circle")
    .attr("r", function(d) { return Math.sqrt(d.importance); })
    .style("fill", function (d) {
    if (d.name.startsWith("other")) { return color(1); } else { return color(2); };
})

node.append("text")
      .attr("dx", function(d) { return (Math.sqrt(d.importance)*30)/Math.sqrt(453.6031403843406); })
      .attr("dy", ".35em")
      .text(function(d) { return d.name });
      
// On each tick the force layout generates new coordinates; copy them onto the SVG elements' attributes
force.on("tick", function () {
    link.attr("x1", function (d) {
        return d.source.x;
    })
        .attr("y1", function (d) {
        return d.source.y;
    })
        .attr("x2", function (d) {
        return d.target.x;
    })
        .attr("y2", function (d) {
        return d.target.y;
    });
    d3.selectAll("circle").attr("cx", function (d) {
        return d.x;
    })
        .attr("cy", function (d) {
        return d.y;
    });
    d3.selectAll("text").attr("x", function (d) {
        return d.x;
    })
        .attr("y", function (d) {
        return d.y;
    });
});
</script>
</div>
""")
}
  
  def help() = {
displayHTML("""
<p>
Produces a force-directed graph given a collection of edges of the following form:<br/>
<tt><font color="#a71d5d">case class</font> <font color="#795da3">Edge</font>(<font color="#ed6a43">src</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">dst</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">count</font>: <font color="#a71d5d">Long</font>)</tt>
</p>
<p>Usage:<br/>
<tt><font color="#a71d5d">import</font> <font color="#ed6a43">d3._</font></tt><br/>
<tt><font color="#795da3">graphs.force</font>(<br/>
&nbsp;&nbsp;<font color="#ed6a43">height</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">width</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">clicks</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Edge</font>],<br/>
&nbsp;&nbsp;<font color="#ed6a43">vertices</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Node</font>])</tt>
</p>""")
  }
}
graphs.force(
  height = 800,
  width = 1200,
  clicks = toplabel2FiltE.edges.as[Edge],
  vertices = toplabel2FiltE.vertices.select($"id".as("name"),$"pagerank".as("importance")).as[Node]
  )

nigerian_cluster

Keep only the top 100 vertices according to PageRank score

val toplabel3Filt =  toplabel3.filterVertices($"pagerank" >=3.160183413696083).filterEdges($"count">4*18).dropIsolatedVertices()

Filter out low-count edges and drop vertices with few incident edges to make the graph more comprehensible

val toplabel3FiltE = toplabel3Filt.filterEdges($"count">50).dropIsolatedVertices()

Malaysian cluster

case class Edge(src: String, dst: String, count: Long)

case class Node(name: String,importance: Double)
case class Link(source: Int, target: Int, value: Long)
case class Graph(nodes: Seq[Node], links: Seq[Link])

object graphs {
val sqlContext = SparkSession.builder().getOrCreate().sqlContext
import sqlContext.implicits._
  
def force(vertices: Dataset[Node],clicks: Dataset[Edge], height: Int = 100, width: Int = 960): Unit = {
  val data = clicks.collect()
  val nodes = vertices.collect()
  val links = data.map { t =>
    Link(nodes.indexWhere(_.name == t.src.replaceAll("_", " ")), nodes.indexWhere(_.name == t.dst.replaceAll("_", " ")), t.count / 20 + 1)
  }
  showGraph(height, width, Seq(Graph(nodes, links)).toDF().toJSON.first())
}

/**
 * Displays a force directed graph using d3
 * input: {"nodes": [{"name": "..."}], "links": [{"source": 1, "target": 2, "value": 0}]}
 */
def showGraph(height: Int, width: Int, graph: String): Unit = {

displayHTML(s"""
<style>

.node_circle {
  stroke: #777;
  stroke-width: 1.3px;
}

.node_label {
  pointer-events: none;
}

.link {
  stroke: #777;
  stroke-opacity: .2;
}

.node_count {
  stroke: #777;
  stroke-width: 1.0px;
  fill: #999;
}

text.legend {
  font-family: Verdana;
  font-size: 13px;
  fill: #000;
}

.node text {
  font-family: "Helvetica Neue","Helvetica","Arial",sans-serif;
  /* a CSS declaration cannot evaluate JavaScript; use a static size */
  font-size: 13px;
  font-weight: 200;
}

</style>

<div id="clicks-graph">
<script src="//d3js.org/d3.v3.min.js"></script>
<script>

var graph = $graph;

var width = $width,
    height = $height;

var color = d3.scale.category20();

var force = d3.layout.force()
    .charge(-200)
    .linkDistance(300)
    .size([width, height]);

var svg = d3.select("#clicks-graph").append("svg")
    .attr("width", width)
    .attr("height", height);
    
force
    .nodes(graph.nodes)
    .links(graph.links)
    .start();

var link = svg.selectAll(".link")
    .data(graph.links)
    .enter().append("line")
    .attr("class", "link")
    .style("stroke-width", function(d) { return Math.sqrt(d.value)/3; });

var node = svg.selectAll(".node")
    .data(graph.nodes)
    .enter().append("g")
    .attr("class", "node")
    .call(force.drag);

node.append("circle")
    .attr("r", function(d) { return (Math.sqrt(d.importance)*30)/Math.sqrt(98.7695771886648); })
    .style("fill", function (d) {
    if (d.name.startsWith("other")) { return color(1); } else { return color(2); };
})

node.append("text")
      .attr("dx", function(d) { return (Math.sqrt(d.importance)*30)/Math.sqrt(26.343032735543023); })
      .attr("dy", ".35em")
      .text(function(d) { return d.name });
      
// On each tick the force layout generates new coordinates; copy them onto the SVG elements' attributes
force.on("tick", function () {
    link.attr("x1", function (d) {
        return d.source.x;
    })
        .attr("y1", function (d) {
        return d.source.y;
    })
        .attr("x2", function (d) {
        return d.target.x;
    })
        .attr("y2", function (d) {
        return d.target.y;
    });
    d3.selectAll("circle").attr("cx", function (d) {
        return d.x;
    })
        .attr("cy", function (d) {
        return d.y;
    });
    d3.selectAll("text").attr("x", function (d) {
        return d.x;
    })
        .attr("y", function (d) {
        return d.y;
    });
});
</script>
</div>
""")
}
  
  def help() = {
displayHTML("""
<p>
Produces a force-directed graph given a collection of edges of the following form:<br/>
<tt><font color="#a71d5d">case class</font> <font color="#795da3">Edge</font>(<font color="#ed6a43">src</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">dst</font>: <font color="#a71d5d">String</font>, <font color="#ed6a43">count</font>: <font color="#a71d5d">Long</font>)</tt>
</p>
<p>Usage:<br/>
<tt><font color="#a71d5d">import</font> <font color="#ed6a43">d3._</font></tt><br/>
<tt><font color="#795da3">graphs.force</font>(<br/>
&nbsp;&nbsp;<font color="#ed6a43">height</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">width</font> = <font color="#795da3">500</font>,<br/>
&nbsp;&nbsp;<font color="#ed6a43">clicks</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Edge</font>],<br/>
&nbsp;&nbsp;<font color="#ed6a43">vertices</font>: <font color="#795da3">Dataset</font>[<font color="#795da3">Node</font>])</tt>
</p>""")
  }
}
graphs.force(
  height = 800,
  width = 1200,
  clicks = toplabel3FiltE.edges.as[Edge],
  vertices = toplabel3FiltE.vertices.select($"id".as("name"),$"pagerank".as("importance")).as[Node]
  )

malaysian_cluster

ScaDaMaLe Course site and book

Plugging into GDELT Mass Media Streams

Johannes Graner (LinkedIn), Albert Nilsson (LinkedIn) and Raazesh Sainudiin (LinkedIn)

2020, Uppsala, Sweden

This project was supported by Combient Mix AB through summer internships at:

Combient Competence Centre for Data Engineering Sciences, Department of Mathematics, Uppsala University, Uppsala, Sweden


This is just a brief teaser into the world of the GDELT-project: https://www.gdeltproject.org/

This exposition was originally from Mastering Spark for Data Science which we will try to dive into in the geospatial modules.

We will use a spark-gdelt library for Spark 3.0.1:

Also see the following, which this work builds on:

  • https://github.com/lamastex/spark-gdelt-examples

What is The GDELT-Project?

From https://www.gdeltproject.org/

Watching our World Unfold

A Global Database of Society

Supported by Google Jigsaw, the GDELT Project monitors the world's broadcast, print, and web news from nearly every corner of every country in over 100 languages and identifies the people, locations, organizations, themes, sources, emotions, counts, quotes, images and events driving our global society every second of every day, creating a free open platform for computing on the entire world.

spinningglobe gdelt project

import com.aamend.spark.gdelt._
import org.apache.spark.sql.Dataset
import spark.implicits._
import com.aamend.spark.gdelt._
import org.apache.spark.sql.Dataset
import spark.implicits._
ls dbfs:/datasets/ScaDaMaLe/GDELT/
path name size
dbfs:/datasets/ScaDaMaLe/GDELT/EventV1/ EventV1/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/EventV2/ EventV2/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/GKGV1/ GKGV1/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/GKGV2/ GKGV2/ 0.0

Download from gdelt-project

Just a pinky toe dip in this ocean of information!

Here we just show briefly how you can download the zipped CSV files from the GDELT-project and turn them into DataSets using the libraries we have contributed to.
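The 15-minute GDELT v2 GKG files downloaded below are published at a predictable URL built from a `yyyyMMddHHmmss` timestamp. A small sketch of that URL construction (the timestamps are the same ones used in the cells below):

```scala
// Sketch: build the download URL for a GDELT v2 GKG zip from its timestamp.
def gkgV2Url(ts: String): String =
  s"http://data.gdeltproject.org/gdeltv2/$ts.gkg.csv.zip"

val urls = Seq("20190517121500", "20190523121500").map(gkgV2Url)
```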

To analyze this data scalably, one needs delta.io tables (just a couple more steps turn these SparkSQL Datasets into such tables). In the sequel we use delta.io tables built in another project to present more interesting detection of events and entities of interest.

Note first that there are several types of GDELT datasets. There is a large set of code-books and manuals one should familiarise oneself with before trying to interpret the structured data - it is fairly detailed. These are available through manual searches of the GDELT project pages.

Here is the minimal collection of them needed to make cohesive sense of a large fraction of the datasets:

ls dbfs:///datasets/ScaDaMaLe/GDELT/
path name size
dbfs:/datasets/ScaDaMaLe/GDELT/20180416121500.gkg.csv 20180416121500.gkg.csv 3.4277858e7
dbfs:/datasets/ScaDaMaLe/GDELT/20190517121500.gkg.csv 20190517121500.gkg.csv 2.5728991e7
dbfs:/datasets/ScaDaMaLe/GDELT/20190523121500.gkg.csv 20190523121500.gkg.csv 2.9695688e7
dbfs:/datasets/ScaDaMaLe/GDELT/EventV1/ EventV1/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/EventV2/ EventV2/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/GKGV1/ GKGV1/ 0.0
dbfs:/datasets/ScaDaMaLe/GDELT/GKGV2/ GKGV2/ 0.0
curl -O http://data.gdeltproject.org/gdeltv2/20190517121500.gkg.csv.zip
curl -O http://data.gdeltproject.org/gdeltv2/20190523121500.gkg.csv.zip
curl -O http://data.gdeltproject.org/gdeltv2/20180416121500.gkg.csv.zip
unzip 20190517121500.gkg.csv.zip
unzip 20190523121500.gkg.csv.zip
unzip 20180416121500.gkg.csv.zip
cp "file:///databricks/driver/20180416121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV2/20180416121500.gkg.csv"
res2: Boolean = true
cp "file:///databricks/driver/20190523121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV2/20190523121500.gkg.csv"
res3: Boolean = true
cp "file:///databricks/driver/20190517121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV2/20190517121500.gkg.csv"
res4: Boolean = true
val gdeltEventV2DS: Dataset[GKGEventV2] = spark.read.gdeltGkgV2("dbfs:/datasets/ScaDaMaLe/GDELT/EventV2/20190523121500.gkg.csv")
gdeltEventV2DS: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.GKGEventV2] = [gkgRecordId: struct<publishDate: timestamp, translingual: boolean ... 1 more field>, publishDate: timestamp ... 27 more fields]
display(gdeltEventV2DS)
curl -O http://data.gdeltproject.org/gdeltv2/20180417121500.gkg.csv.zip
curl -O http://data.gdeltproject.org/gdeltv2/20190523121500.gkg.csv.zip
curl -O http://data.gdeltproject.org/gdeltv2/20190517121500.gkg.csv.zip
ls dbfs:///datasets/ScaDaMaLe/GDELT/
unzip 20180417121500.gkg.csv.zip
unzip 20190523121500.gkg.csv.zip
unzip 20190517121500.gkg.csv.zip
cp "file:///databricks/driver/20180417121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV2/20180417121500.gkg.csv"
res8: Boolean = true
cp "file:///databricks/driver/20190523121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV2/20190523121500.gkg.csv"
res9: Boolean = true
cp "file:///databricks/driver/20190517121500.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV2/20190517121500.gkg.csv"
res10: Boolean = true
val gdeltGkgV2DS: Dataset[GKGEventV2] = spark.read.gdeltGkgV2("dbfs:///datasets/ScaDaMaLe/GDELT/GKGV2/20190517121500.gkg.csv")
gdeltGkgV2DS: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.GKGEventV2] = [gkgRecordId: struct<publishDate: timestamp, translingual: boolean ... 1 more field>, publishDate: timestamp ... 27 more fields]
display(gdeltGkgV2DS)
curl -O http://data.gdeltproject.org/gkg/20190517.gkg.csv.zip
curl -O http://data.gdeltproject.org/gkg/20190523.gkg.csv.zip
curl -O http://data.gdeltproject.org/gkg/20180417.gkg.csv.zip
unzip 20190517.gkg.csv.zip
unzip 20190523.gkg.csv.zip
unzip 20180417.gkg.csv.zip
cp "file:///databricks/driver/20180417.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV1/20180417.gkg.csv"
res32: Boolean = true
cp "file:///databricks/driver//20190523.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV1/20190523.gkg.csv"
cp "file:///databricks/driver/20190517.gkg.csv" "dbfs:///datasets/ScaDaMaLe/GDELT/GKGV1/20190517.gkg.csv"
val gdeltGkgV1DS: Dataset[GKGEventV1] = spark.read.gdeltGkgV1("dbfs:///datasets/ScaDaMaLe/GDELT/GKGV1/20190517.gkg.csv")
gdeltGkgV1DS: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.GKGEventV1] = [publishDate: timestamp, numArticles: int ... 11 more fields]
display(gdeltGkgV1DS)
curl -O http://data.gdeltproject.org/events/20190517.export.CSV.zip
curl -O http://data.gdeltproject.org/events/20190523.export.CSV.zip
curl -O http://data.gdeltproject.org/events/20180416.export.CSV.zip
unzip 20190517.export.CSV.zip
unzip 20190523.export.CSV.zip
unzip 20180416.export.CSV.zip
cp "file:///databricks/driver/20190517.export.CSV" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV1/20190517.export.CSV"
res34: Boolean = true
cp "file:///databricks/driver/20190523.export.CSV" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV1/20190523.export.CSV"
cp "file:///databricks/driver/20180416.export.CSV" "dbfs:///datasets/ScaDaMaLe/GDELT/EventV1/20180416.export.CSV"

We have now downloaded and ingested the data into our distributed file store. It is time for some structured data processing using Datasets in Spark!

GDELT SparkSQL Datasets

Let us turn these ingested CSV files into Spark Datasets next.

val gdeltEventV1DS: Dataset[EventV1] = spark.read.gdeltEventV1("dbfs:///datasets/ScaDaMaLe/GDELT/EventV1/20180416.export.CSV")
gdeltEventV1DS: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.EventV1] = [eventId: int, eventDay: date ... 19 more fields]
display(gdeltEventV1DS)

Let's look at the locations field.

We want to be able to filter by a country.

val gdeltGkgDS: Dataset[GKGEventV2] = spark.read.gdeltGkgV2("dbfs:///datasets/ScaDaMaLe/GDELT/GKGV2/20190517121500.gkg.csv")
gdeltGkgDS: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.GKGEventV2] = [gkgRecordId: struct<publishDate: timestamp, translingual: boolean ... 1 more field>, publishDate: timestamp ... 27 more fields]
display(gdeltGkgDS.select($"locations"))
val USgdeltGkgDS = gdeltGkgDS.withColumn("loc",$"locations"(0))
          .filter($"loc.countryCode" contains "US").drop("loc")
USgdeltGkgDS: org.apache.spark.sql.DataFrame = [gkgRecordId: struct<publishDate: timestamp, translingual: boolean ... 1 more field>, publishDate: timestamp ... 27 more fields]
val IEgdeltGkgDS = gdeltGkgDS.withColumn("loc",$"locations"(0))
          .filter($"loc.countryCode" contains "IE").drop("loc")
IEgdeltGkgDS: org.apache.spark.sql.DataFrame = [gkgRecordId: struct<publishDate: timestamp, translingual: boolean ... 1 more field>, publishDate: timestamp ... 27 more fields]
IEgdeltGkgDS.count
res17: Long = 0
USgdeltGkgDS.count
res18: Long = 549
display(USgdeltGkgDS)
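The country filter above takes the first element of each record's `locations` array and keeps the record when its `countryCode` contains the target code. A plain-Scala sketch of those semantics (the record shapes and sample values here are made up for illustration, not the actual GKG schema):

```scala
// Sketch of the country filter: look at a record's first location, if any,
// and keep the record when its countryCode matches.
case class Location(countryCode: String, name: String)
case class Gkg(id: String, locations: Seq[Location])

val records = Seq(
  Gkg("r1", Seq(Location("US", "washington"), Location("IE", "dublin"))),
  Gkg("r2", Seq(Location("FR", "paris"))),
  Gkg("r3", Seq.empty))

// $"locations"(0) in the cell above corresponds to headOption here;
// records with an empty locations array are dropped by the filter.
val usRecords = records.filter(_.locations.headOption.exists(_.countryCode contains "US"))
```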

GDELT Reference data

Here are various code-books used in the GDELT project. They are nicely available for you through the spark-gdelt library we have already loaded.

val countryCodes: Dataset[CountryCode] = spark.loadCountryCodes
val gcam: Dataset[GcamCode] = spark.loadGcams
val cameoEvent: Dataset[CameoCode] = spark.loadCameoEventCodes
val cameoType: Dataset[CameoCode] = spark.loadCameoTypeCodes
val cameoGroup: Dataset[CameoCode] = spark.loadCameoGroupCodes
val cameoEthnic: Dataset[CameoCode] = spark.loadCameoEthnicCodes
val cameoReligion: Dataset[CameoCode] = spark.loadCameoReligionCodes
val cameoCountry: Dataset[CameoCode] = spark.loadCameoCountryCodes
countryCodes: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CountryCode] = [iso: string, iso3: string ... 3 more fields]
gcam: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.GcamCode] = [gcamCode: string, dictionaryId: string ... 6 more fields]
cameoEvent: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
cameoType: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
cameoGroup: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
cameoEthnic: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
cameoReligion: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
cameoCountry: org.apache.spark.sql.Dataset[com.aamend.spark.gdelt.CameoCode] = [cameoCode: string, cameoValue: string]
display(countryCodes)
iso iso3 isoNumeric fips country
AD AND 020 AN andorra
AE ARE 784 AE united arab emirates
AF AFG 004 AF afghanistan
AG ATG 028 AC antigua and barbuda
AI AIA 660 AV anguilla
AL ALB 008 AL albania
AM ARM 051 AM armenia
AO AGO 024 AO angola
AQ ATA 010 AY antarctica
AR ARG 032 AR argentina
AS ASM 016 AQ american samoa
AT AUT 040 AU austria
AU AUS 036 AS australia
AW ABW 533 AA aruba
AX ALA 248 aland islands
AZ AZE 031 AJ azerbaijan
BA BIH 070 BK bosnia and herzegovina
BB BRB 052 BB barbados
BD BGD 050 BG bangladesh
BE BEL 056 BE belgium
BF BFA 854 UV burkina faso
BG BGR 100 BU bulgaria
BH BHR 048 BA bahrain
BI BDI 108 BY burundi
BJ BEN 204 BN benin
BL BLM 652 TB saint barthelemy
BM BMU 060 BD bermuda
BN BRN 096 BX brunei
BO BOL 068 BL bolivia
BQ BES 535 bonaire, saint eustatius and saba
BR BRA 076 BR brazil
BS BHS 044 BF bahamas
BT BTN 064 BT bhutan
BV BVT 074 BV bouvet island
BW BWA 072 BC botswana
BY BLR 112 BO belarus
BZ BLZ 084 BH belize
CA CAN 124 CA canada
CC CCK 166 CK cocos islands
CD COD 180 CG democratic republic of the congo
CF CAF 140 CT central african republic
CG COG 178 CF republic of the congo
CH CHE 756 SZ switzerland
CI CIV 384 IV ivory coast
CK COK 184 CW cook islands
CL CHL 152 CI chile
CM CMR 120 CM cameroon
CN CHN 156 CH china
CO COL 170 CO colombia
CR CRI 188 CS costa rica
CU CUB 192 CU cuba
CV CPV 132 CV cape verde
CW CUW 531 UC curacao
CX CXR 162 KT christmas island
CY CYP 196 CY cyprus
CZ CZE 203 EZ czechia
DE DEU 276 GM germany
DJ DJI 262 DJ djibouti
DK DNK 208 DA denmark
DM DMA 212 DO dominica
DO DOM 214 DR dominican republic
DZ DZA 012 AG algeria
EC ECU 218 EC ecuador
EE EST 233 EN estonia
EG EGY 818 EG egypt
EH ESH 732 WI western sahara
ER ERI 232 ER eritrea
ES ESP 724 SP spain
ET ETH 231 ET ethiopia
FI FIN 246 FI finland
FJ FJI 242 FJ fiji
FK FLK 238 FK falkland islands
FM FSM 583 FM micronesia
FO FRO 234 FO faroe islands
FR FRA 250 FR france
GA GAB 266 GB gabon
GB GBR 826 UK united kingdom
GD GRD 308 GJ grenada
GE GEO 268 GG georgia
GF GUF 254 FG french guiana
GG GGY 831 GK guernsey
GH GHA 288 GH ghana
GI GIB 292 GI gibraltar
GL GRL 304 GL greenland
GM GMB 270 GA gambia
GN GIN 324 GV guinea
GP GLP 312 GP guadeloupe
GQ GNQ 226 EK equatorial guinea
GR GRC 300 GR greece
GS SGS 239 SX south georgia and the south sandwich islands
GT GTM 320 GT guatemala
GU GUM 316 GQ guam
GW GNB 624 PU guinea-bissau
GY GUY 328 GY guyana
HK HKG 344 HK hong kong
HM HMD 334 HM heard island and mcdonald islands
HN HND 340 HO honduras
HR HRV 191 HR croatia
HT HTI 332 HA haiti
HU HUN 348 HU hungary
ID IDN 360 ID indonesia
IE IRL 372 EI ireland
IL ISR 376 IS israel
IM IMN 833 IM isle of man
IN IND 356 IN india
IO IOT 086 IO british indian ocean territory
IQ IRQ 368 IZ iraq
IR IRN 364 IR iran
IS ISL 352 IC iceland
IT ITA 380 IT italy
JE JEY 832 JE jersey
JM JAM 388 JM jamaica
JO JOR 400 JO jordan
JP JPN 392 JA japan
KE KEN 404 KE kenya
KG KGZ 417 KG kyrgyzstan
KH KHM 116 CB cambodia
KI KIR 296 KR kiribati
KM COM 174 CN comoros
KN KNA 659 SC saint kitts and nevis
KP PRK 408 KN north korea
KR KOR 410 KS south korea
XK XKX 0 KV kosovo
KW KWT 414 KU kuwait
KY CYM 136 CJ cayman islands
KZ KAZ 398 KZ kazakhstan
LA LAO 418 LA laos
LB LBN 422 LE lebanon
LC LCA 662 ST saint lucia
LI LIE 438 LS liechtenstein
LK LKA 144 CE sri lanka
LR LBR 430 LI liberia
LS LSO 426 LT lesotho
LT LTU 440 LH lithuania
LU LUX 442 LU luxembourg
LV LVA 428 LG latvia
LY LBY 434 LY libya
MA MAR 504 MO morocco
MC MCO 492 MN monaco
MD MDA 498 MD moldova
ME MNE 499 MJ montenegro
MF MAF 663 RN saint martin
MG MDG 450 MA madagascar
MH MHL 584 RM marshall islands
MK MKD 807 MK macedonia
ML MLI 466 ML mali
MM MMR 104 BM myanmar
MN MNG 496 MG mongolia
MO MAC 446 MC macao
MP MNP 580 CQ northern mariana islands
MQ MTQ 474 MB martinique
MR MRT 478 MR mauritania
MS MSR 500 MH montserrat
MT MLT 470 MT malta
MU MUS 480 MP mauritius
MV MDV 462 MV maldives
MW MWI 454 MI malawi
MX MEX 484 MX mexico
MY MYS 458 MY malaysia
MZ MOZ 508 MZ mozambique
NA NAM 516 WA namibia
NC NCL 540 NC new caledonia
NE NER 562 NG niger
NF NFK 574 NF norfolk island
NG NGA 566 NI nigeria
NI NIC 558 NU nicaragua
NL NLD 528 NL netherlands
NO NOR 578 NO norway
NP NPL 524 NP nepal
NR NRU 520 NR nauru
NU NIU 570 NE niue
NZ NZL 554 NZ new zealand
OM OMN 512 MU oman
PA PAN 591 PM panama
PE PER 604 PE peru
PF PYF 258 FP french polynesia
PG PNG 598 PP papua new guinea
PH PHL 608 RP philippines
PK PAK 586 PK pakistan
PL POL 616 PL poland
PM SPM 666 SB saint pierre and miquelon
PN PCN 612 PC pitcairn
PR PRI 630 RQ puerto rico
PS PSE 275 WE palestinian territory
PT PRT 620 PO portugal
PW PLW 585 PS palau
PY PRY 600 PA paraguay
QA QAT 634 QA qatar
RE REU 638 RE reunion
RO ROU 642 RO romania
RS SRB 688 RI serbia
RU RUS 643 RS russia
RW RWA 646 RW rwanda
SA SAU 682 SA saudi arabia
SB SLB 090 BP solomon islands
SC SYC 690 SE seychelles
SD SDN 729 SU sudan
SS SSD 728 OD south sudan
SE SWE 752 SW sweden
SG SGP 702 SN singapore
SH SHN 654 SH saint helena
SI SVN 705 SI slovenia
SJ SJM 744 SV svalbard and jan mayen
SK SVK 703 LO slovakia
SL SLE 694 SL sierra leone
SM SMR 674 SM san marino
SN SEN 686 SG senegal
SO SOM 706 SO somalia
SR SUR 740 NS suriname
ST STP 678 TP sao tome and principe
SV SLV 222 ES el salvador
SX SXM 534 NN sint maarten
SY SYR 760 SY syria
SZ SWZ 748 WZ swaziland
TC TCA 796 TK turks and caicos islands
TD TCD 148 CD chad
TF ATF 260 FS french southern territories
TG TGO 768 TO togo
TH THA 764 TH thailand
TJ TJK 762 TI tajikistan
TK TKL 772 TL tokelau
TL TLS 626 TT east timor
TM TKM 795 TX turkmenistan
TN TUN 788 TS tunisia
TO TON 776 TN tonga
TR TUR 792 TU turkey
TT TTO 780 TD trinidad and tobago
TV TUV 798 TV tuvalu
TW TWN 158 TW taiwan
TZ TZA 834 TZ tanzania
UA UKR 804 UP ukraine
UG UGA 800 UG uganda
UM UMI 581 united states minor outlying islands
US USA 840 US united states
UY URY 858 UY uruguay
UZ UZB 860 UZ uzbekistan
VA VAT 336 VT vatican
VC VCT 670 VC saint vincent and the grenadines
VE VEN 862 VE venezuela
VG VGB 092 VI british virgin islands
VI VIR 850 VQ u.s. virgin islands
VN VNM 704 VM vietnam
VU VUT 548 NH vanuatu
WF WLF 876 WF wallis and futuna
WS WSM 882 WS samoa
YE YEM 887 YM yemen
YT MYT 175 MF mayotte
ZA ZAF 710 SF south africa
ZM ZMB 894 ZA zambia
ZW ZWE 716 ZI zimbabwe
CS SCG 891 YI serbia and montenegro
AN ANT 530 NT netherlands antilles
display(cameoEvent)
cameoCode cameoValue
01 make public statement
010 make statement, not specified below
011 decline comment
012 make pessimistic comment
013 make optimistic comment
014 consider policy option
015 acknowledge or claim responsibility
016 deny responsibility
017 engage in symbolic act
018 make empathetic comment
019 express accord
02 appeal
020 appeal, not specified below
021 appeal for material cooperation, not specified below
0211 appeal for economic cooperation
0212 appeal for military cooperation
0213 appeal for judicial cooperation
0214 appeal for intelligence
022 appeal for diplomatic cooperation, such as policy support
023 appeal for aid, not specified below
0231 appeal for economic aid
0232 appeal for military aid
0233 appeal for humanitarian aid
0234 appeal for military protection or peacekeeping
024 appeal for political reform, not specified below
0241 appeal for change in leadership
0242 appeal for policy change
0243 appeal for rights
0244 appeal for change in institutions, regime
025 appeal to yield
0251 appeal for easing of administrative sanctions
0252 appeal for easing of popular dissent
0253 appeal for release of persons or property
0254 appeal for easing of economic sanctions, boycott, or embargo
0255 appeal for target to allow international involvement (non-mediation)
0256 appeal for de-escalation of military engagement
026 appeal to others to meet or negotiate
027 appeal to others to settle dispute
028 appeal to others to engage in or accept mediation
03 express intent to cooperate
030 express intent to cooperate, not specified below
031 express intent to engage in material cooperation, not specified below
0311 express intent to cooperate economically
0312 express intent to cooperate militarily
0313 express intent to cooperate on judicial matters
0314 express intent to cooperate on intelligence
032 express intent to provide diplomatic cooperation such as policy support
033 express intent to provide material aid, not specified below
0331 express intent to provide economic aid
0332 express intent to provide military aid
0333 express intent to provide humanitarian aid
0334 express intent to provide military protection or peacekeeping
034 express intent to institute political reform, not specified below
0341 express intent to change leadership
0342 express intent to change policy
0343 express intent to provide rights
0344 express intent to change institutions, regime
035 express intent to yield, not specified below
0351 express intent to ease administrative sanctions
0352 express intent to ease popular dissent
0353 express intent to release persons or property
0354 express intent to ease economic sanctions, boycott, or embargo
0355 express intent allow international involvement (not mediation)
0356 express intent to de-escalate military engagement
036 express intent to meet or negotiate
037 express intent to settle dispute
038 express intent to accept mediation
039 express intent to mediate
04 consult
040 consult, not specified below
041 discuss by telephone
042 make a visit
043 host a visit
044 meet at a "third" location
045 mediate
046 engage in negotiation
05 engage in diplomatic cooperation
050 engage in diplomatic cooperation, not specified below
051 praise or endorse
052 defend verbally
053 rally support on behalf of
054 grant diplomatic recognition
055 apologize
056 forgive
057 sign formal agreement
06 engage in material cooperation
060 engage in material cooperation, not specified below
061 cooperate economically
062 cooperate militarily
063 engage in judicial cooperation
064 share intelligence or information
07 provide aid
070 provide aid, not specified below
071 provide economic aid
072 provide military aid
073 provide humanitarian aid
074 provide military protection or peacekeeping
075 grant asylum
08 yield
080 yield, not specified below
081 ease administrative sanctions, not specified below
0811 ease restrictions on political freedoms
0812 ease ban on political parties or politicians
0813 ease curfew
0814 ease state of emergency or martial law
082 ease political dissent
083 accede to requests or demands for political reform, not specified below
0831 accede to demands for change in leadership
0832 accede to demands for change in policy
0833 accede to demands for rights
0834 accede to demands for change in institutions, regime
084 return, release, not specified below
0841 return, release person(s)
0842 return, release property
085 ease economic sanctions, boycott, embargo
086 allow international involvement, not specified below
0861 receive deployment of peacekeepers
0862 receive inspectors
0863 allow delivery of humanitarian aid
087 de-escalate military engagement
0871 declare truce, ceasefire
0872 ease military blockade
0873 demobilize armed forces
0874 retreat or surrender militarily
09 investigate
090 investigate, not specified below
091 investigate crime, corruption
092 investigate human rights abuses
093 investigate military action
094 investigate war crimes
10 demand
100 demand, not specified below
101 demand information, investigation
1011 demand economic cooperation
1012 demand military cooperation
1013 demand judicial cooperation
1014 demand intelligence cooperation
102 demand policy support
103 demand aid, protection, or peacekeeping
1031 demand economic aid
1032 demand military aid
1033 demand humanitarian aid
1034 demand military protection or peacekeeping
104 demand political reform, not specified below
1041 demand change in leadership
1042 demand policy change
1043 demand rights
1044 demand change in institutions, regime
105 demand mediation
1051 demand easing of administrative sanctions
1052 demand easing of political dissent
1053 demand release of persons or property
1054 demand easing of economic sanctions, boycott, or embargo
1055 demand that target allows international involvement (non-mediation)
1056 demand de-escalation of military engagement
106 demand withdrawal
107 demand ceasefire
108 demand meeting, negotiation
11 disapprove
110 disapprove, not specified below
111 criticize or denounce
112 accuse, not specified below
1121 accuse of crime, corruption
1122 accuse of human rights abuses
1123 accuse of aggression
1124 accuse of war crimes
1125 accuse of espionage, treason
113 rally opposition against
114 complain officially
115 bring lawsuit against
116 find guilty or liable (legally)
12 reject
120 reject, not specified below
121 reject material cooperation
1211 reject economic cooperation
1212 reject military cooperation
122 reject request or demand for material aid, not specified below
1221 reject request for economic aid
1222 reject request for military aid
1223 reject request for humanitarian aid
1224 reject request for military protection or peacekeeping
123 reject request or demand for political reform, not specified below
1231 reject request for change in leadership
1232 reject request for policy change
1233 reject request for rights
1234 reject request for change in institutions, regime
124 refuse to yield, not specified below
1241 refuse to ease administrative sanctions
1242 refuse to ease popular dissent
1243 refuse to release persons or property
1244 refuse to ease economic sanctions, boycott, or embargo
1245 refuse to allow international involvement (non-mediation)
1246 refuse to de-escalate military engagement
125 reject proposal to meet, discuss, or negotiate
126 reject mediation
127 reject plan, agreement to settle dispute
128 defy norms, law
129 veto
13 threaten
130 threaten, not specified below
131 threaten non-force, not specified below
1311 threaten to reduce or stop aid
1312 threaten to boycott, embargo, or sanction
1313 threaten to reduce or break relations
132 threaten with administrative sanctions, not specified below
1321 threaten to impose restrictions on political freedoms
1322 threaten to ban political parties or politicians
1323 threaten to impose curfew
1324 threaten to impose state of emergency or martial law
133 threaten political dissent, protest
134 threaten to halt negotiations
135 threaten to halt mediation
136 threaten to halt international involvement (non-mediation)
137 threaten with violent repression
138 threaten to use military force, not specified below
1381 threaten blockade
1382 threaten occupation
1383 threaten unconventional violence
1384 threaten conventional attack
1385 threaten attack with wmd
139 give ultimatum
14 protest
140 engage in political dissent, not specified below
141 demonstrate or rally
1411 demonstrate for leadership change
1412 demonstrate for policy change
1413 demonstrate for rights
1414 demonstrate for change in institutions, regime
142 conduct hunger strike, not specified below
1421 conduct hunger strike for leadership change
1422 conduct hunger strike for policy change
1423 conduct hunger strike for rights
1424 conduct hunger strike for change in institutions, regime
143 conduct strike or boycott, not specified below
1431 conduct strike or boycott for leadership change
1432 conduct strike or boycott for policy change
1433 conduct strike or boycott for rights
1434 conduct strike or boycott for change in institutions, regime
144 obstruct passage, block
1441 obstruct passage to demand leadership change
1442 obstruct passage to demand policy change
1443 obstruct passage to demand rights
1444 obstruct passage to demand change in institutions, regime
145 protest violently, riot
1451 engage in violent protest for leadership change
1452 engage in violent protest for policy change
1453 engage in violent protest for rights
1454 engage in violent protest for change in institutions, regime
15 exhibit force posture
150 demonstrate military or police power, not specified below
151 increase police alert status
152 increase military alert status
153 mobilize or increase police power
154 mobilize or increase armed forces
16 reduce relations
160 reduce relations, not specified below
161 reduce or break diplomatic relations
162 reduce or stop aid, not specified below
1621 reduce or stop economic assistance
1622 reduce or stop military assistance
1623 reduce or stop humanitarian assistance
163 impose embargo, boycott, or sanctions
164 halt negotiations
165 halt mediation
166 expel or withdraw, not specified below
1661 expel or withdraw peacekeepers
1662 expel or withdraw inspectors, observers
1663 expel or withdraw aid agencies
17 coerce
170 coerce, not specified below
171 seize or damage property, not specified below
1711 confiscate property
1712 destroy property
172 impose administrative sanctions, not specified below
1721 impose restrictions on political freedoms
1722 ban political parties or politicians
1723 impose curfew
1724 impose state of emergency or martial law
173 arrest, detain, or charge with legal action
174 expel or deport individuals
175 use tactics of violent repression
18 assault
180 use unconventional violence, not specified below
181 abduct, hijack, or take hostage
182 physically assault, not specified below
1821 sexually assault
1822 torture
1823 kill by physical assault
183 conduct suicide, car, or other non-military bombing, not specified below
1831 carry out suicide bombing
1832 carry out car bombing
1833 carry out roadside bombing
184 use as human shield
185 attempt to assassinate
186 assassinate
19 fight
190 use conventional military force, not specified below
191 impose blockade, restrict movement
192 occupy territory
193 fight with small arms and light weapons
194 fight with artillery and tanks
195 employ aerial weapons
196 violate ceasefire
20 use unconventional mass violence
200 use unconventional mass violence, not specified below
201 engage in mass expulsion
202 engage in mass killings
203 engage in ethnic cleansing
204 use weapons of mass destruction, not specified below
2041 use chemical, biological, or radiological weapons
2042 detonate nuclear weapons
ls dbfs:/datasets/ScaDaMaLe/GDELT/
path name size
dbfs:/datasets/ScaDaMaLe/GDELT/20180416121500.gkg.csv 20180416121500.gkg.csv 3.4277858e7
dbfs:/datasets/ScaDaMaLe/GDELT/20190517121500.gkg.csv 20190517121500.gkg.csv 2.5728991e7
dbfs:/datasets/ScaDaMaLe/GDELT/20190523121500.gkg.csv 20190523121500.gkg.csv 2.9695688e7

ScaDaMaLe Course site and book

Markov Model for Trend Calculus

Johannes Graner (LinkedIn), Albert Nilsson (LinkedIn) and Raazesh Sainudiin (LinkedIn)

2020, Uppsala, Sweden

This project was supported by Combient Mix AB through summer internships at:

Combient Competence Centre for Data Engineering Sciences, Department of Mathematics, Uppsala University, Uppsala, Sweden


We use the dataset generated in the last notebook to build a simple, proof-of-concept Markov model for predicting trends.

This is merely to verify, using a simple model, whether there is indeed any predictive value in the trends and their reversals that many real-world traders bet on every instant in the market.

Resources

This builds on the following library and its antecedents therein:

This work was inspired by:

"./000a_finance_utils"
import java.sql.Timestamp
import io.delta.tables._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, OutputMode, Trigger}
import org.apache.spark.sql.types._
import org.apache.spark.sql.expressions.{Window, WindowSpec}
import org.lamastex.spark.trendcalculus._
import scala.util.Random
defined object TrendUtils

Reading the joined dataset from the last notebook.

We train the model using both oil and gold data and predict trends in oil data. We show that this yields better results than just training on the oil data.

val rootPath = TrendUtils.getStreamableTrendCalculusPath
val maxRevPath = rootPath + "maxRev"
val maxRevDS = spark.read.format("delta").load(maxRevPath).as[FlatReversal]

We want to predict what the trend of the next data point will be given the trend reversals we have observed.

For this, we use an m-th order Markov model: we look at the reversal states of the last m points and use them to predict the trends of the next n points. k is the maximum order of reversal considered when training the model.

trainingRatio is the ratio of the data used for training the model, the rest is used for testing.
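As a plain-Scala sketch (hypothetical values, no Spark), each training example pairs the reversals of the last m points (the key) with the trends of the next n points (the value) — this is what the lagColumn function below builds with window functions:

```scala
// Hypothetical illustration of the lagging step. For every index i, the key
// is the m reversals strictly before i and the value is the n trends from i on.
val m = 3; val n = 1
val reversals = Seq(0, 1, -1, 0, 2, 0)
val trends    = Seq(1, 1, -1, -1, 1, 1)
val examples = (m to reversals.size - n).map { i =>
  (reversals.slice(i - m, i), trends.slice(i, i + n))
}
// examples: (List(0, 1, -1), List(-1)), (List(1, -1, 0), List(1)), (List(-1, 0, 2), List(1))
```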

val modelPath = rootPath + "estimators/"
val maxRevDSWithLagCountPath = modelPath + "maxRevDSWithLag"

val numPartitions = 10
val partialModelPaths = (1 to numPartitions).map( i => modelPath + s"partialModel${i}" )
val fullModelPath = modelPath + "fullModel"

val m = 10
val n = 1
val k = 18
val trainingRatio = 0.7
type FinalModel = Map[Seq[Int], Map[Seq[Int], Double]]
def truncRev(k: Int)(rev: Int): Int = {
  if (math.abs(rev) > k) k*rev.signum else rev
}
val truncRevUDF = udf{ rev: Int => rev.signum }
def truncRevsUDF(k: Int) = udf{ revs: Seq[Int] => revs.map(truncRev(k)) }

def lagColumn(df: DataFrame, orderColumnName: String, lagKeyName: String, lagValueName: String, m: Int, n: Int): DataFrame = {
  val windowSpec = Window.partitionBy("ticker").orderBy(orderColumnName)
  val laggedKeyColNames = (1 to m).map( i => s"lagKey$i" ).toSeq
  val laggedValueColNames = (1 to n).map( i => s"lagValue$i" ).toSeq
  val dfWithLaggedKeyColumns = (n+1 to m+n)
    .foldLeft(df)( (df: DataFrame, i: Int) => df.withColumn(laggedKeyColNames(i-n-1), lag(lagKeyName, i-1, Int.MaxValue).over(windowSpec)) )
  val dfWithLaggedKeyValueColumns = (1 to n)
    .foldLeft(dfWithLaggedKeyColumns)( (df: DataFrame, i: Int) => df.withColumn(laggedValueColNames(i-1), lag(lagValueName, i-1, Int.MaxValue).over(windowSpec)) )
  
  dfWithLaggedKeyValueColumns
    .withColumn("lagKey", array(laggedKeyColNames.reverse.take(m).map(col(_)):_*))
    .withColumn("lagValue", array(laggedValueColNames.reverse.takeRight(n).map(col(_)):_*))
    .withColumn("lagKeyFirst", col(laggedKeyColNames.last))
    .filter($"lagKeyFirst" =!= Int.MaxValue)
    .drop("lagKeyFirst")
    .drop(laggedKeyColNames:_*)
    .drop(laggedValueColNames:_*)
}
truncRev: (k: Int)(rev: Int)Int
truncRevUDF: org.apache.spark.sql.expressions.UserDefinedFunction = SparkUserDefinedFunction($Lambda$9209/1126103335@6b64702c,IntegerType,List(Some(class[value[0]: int])),None,false,true)
truncRevsUDF: (k: Int)org.apache.spark.sql.expressions.UserDefinedFunction
lagColumn: (df: org.apache.spark.sql.DataFrame, orderColumnName: String, lagKeyName: String, lagValueName: String, m: Int, n: Int)org.apache.spark.sql.DataFrame

The trend at each point can be extracted from the trend reversals by summing the signs of all previous reversals (i.e. their first-order truncations). This sum is always either 0 (up trend) or -1 (down trend); 0 is therefore mapped to 1 so that (1, -1) corresponds to (up, down).
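The computation above can be sketched in plain Scala (hypothetical reversal values, no Spark):

```scala
// Running sum of reversal signs is always 0 (up) or -1 (down);
// 0 is then mapped to 1 so the trend alphabet is (1, -1) = (up, down).
val reversals = Seq(-6, 0, 0, 1, -1, 0, 4, -2, 0, 1)
val trends = reversals
  .scanLeft(0)((acc, rev) => acc + rev.signum)  // cumulative sum of signs
  .tail
  .map(s => if (s == 0) 1 else -1)              // 0 -> up (1), -1 -> down (-1)
// trends: Seq(-1, -1, -1, 1, -1, -1, 1, -1, -1, 1)
```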

val maxRevDSWithLag = lagColumn(
  maxRevDS
    .orderBy("x")
    .toDF
    .withColumn("truncRev", truncRevUDF($"reversal"))
    .withColumn(
      "tmpTrend",
      sum("truncRev").over(
        Window
          .partitionBy("ticker")
          .orderBy("x")
          .rowsBetween(Window.unboundedPreceding, Window.currentRow)
      )
    )
    .withColumn("trend", when($"tmpTrend" === 0, 1).otherwise(-1))
    .drop("truncRev", "tmpTrend"),
  "x", 
  "reversal",
  "trend",
  m, 
  n
)
maxRevDSWithLag: org.apache.spark.sql.DataFrame = [ticker: string, x: timestamp ... 5 more fields]

We now want to predict lagValue from lagKey.

maxRevDSWithLag.show(20, false)
+------+-------------------+------+--------+-----+--------------------------------+--------+
|ticker|x                  |y     |reversal|trend|lagKey                          |lagValue|
+------+-------------------+------+--------+-----+--------------------------------+--------+
|XAUUSD|2009-03-15 18:09:00|925.4 |4       |1    |[-6, 0, 0, 1, -1, 0, 0, 0, 0, 0]|[1]     |
|XAUUSD|2009-03-15 18:10:00|925.75|-2      |-1   |[0, 0, 1, -1, 0, 0, 0, 0, 0, 4] |[-1]    |
|XAUUSD|2009-03-15 18:11:00|925.7 |0       |-1   |[0, 1, -1, 0, 0, 0, 0, 0, 4, -2]|[-1]    |
|XAUUSD|2009-03-15 18:12:00|925.65|1       |1    |[1, -1, 0, 0, 0, 0, 0, 4, -2, 0]|[1]     |
|XAUUSD|2009-03-15 18:13:00|925.65|0       |1    |[-1, 0, 0, 0, 0, 0, 4, -2, 0, 1]|[1]     |
|XAUUSD|2009-03-15 18:14:00|925.75|0       |1    |[0, 0, 0, 0, 0, 4, -2, 0, 1, 0] |[1]     |
|XAUUSD|2009-03-15 18:15:00|925.75|-1      |-1   |[0, 0, 0, 0, 4, -2, 0, 1, 0, 0] |[-1]    |
|XAUUSD|2009-03-15 18:16:00|925.65|0       |-1   |[0, 0, 0, 4, -2, 0, 1, 0, 0, -1]|[-1]    |
|XAUUSD|2009-03-15 18:17:00|925.6 |2       |1    |[0, 0, 4, -2, 0, 1, 0, 0, -1, 0]|[1]     |
|XAUUSD|2009-03-15 18:18:00|925.85|0       |1    |[0, 4, -2, 0, 1, 0, 0, -1, 0, 2]|[1]     |
|XAUUSD|2009-03-15 18:19:00|926.05|0       |1    |[4, -2, 0, 1, 0, 0, -1, 0, 2, 0]|[1]     |
|XAUUSD|2009-03-15 18:20:00|925.95|0       |1    |[-2, 0, 1, 0, 0, -1, 0, 2, 0, 0]|[1]     |
|XAUUSD|2009-03-15 18:21:00|926.55|0       |1    |[0, 1, 0, 0, -1, 0, 2, 0, 0, 0] |[1]     |
|XAUUSD|2009-03-15 18:22:00|926.95|0       |1    |[1, 0, 0, -1, 0, 2, 0, 0, 0, 0] |[1]     |
|XAUUSD|2009-03-15 18:23:00|927.25|0       |1    |[0, 0, -1, 0, 2, 0, 0, 0, 0, 0] |[1]     |
|XAUUSD|2009-03-15 18:24:00|927.45|-4      |-1   |[0, -1, 0, 2, 0, 0, 0, 0, 0, 0] |[-1]    |
|XAUUSD|2009-03-15 18:25:00|927.2 |0       |-1   |[-1, 0, 2, 0, 0, 0, 0, 0, 0, -4]|[-1]    |
|XAUUSD|2009-03-15 18:26:00|926.85|0       |-1   |[0, 2, 0, 0, 0, 0, 0, 0, -4, 0] |[-1]    |
|XAUUSD|2009-03-15 18:27:00|926.7 |0       |-1   |[2, 0, 0, 0, 0, 0, 0, -4, 0, 0] |[-1]    |
|XAUUSD|2009-03-15 18:28:00|926.5 |0       |-1   |[0, 0, 0, 0, 0, 0, -4, 0, 0, 0] |[-1]    |
+------+-------------------+------+--------+-----+--------------------------------+--------+
only showing top 20 rows

Cleaning up the last run and writing the model training input to delta tables.

dbutils.fs.rm(maxRevDSWithLagCountPath, recurse=true)

maxRevDSWithLag
  .withColumn("count", lit(1L))
  .write
  .format("delta")
  .mode("overwrite")
  .save(maxRevDSWithLagCountPath)
val divUDF = udf{ (a: Long, b: Long) => a.toDouble/b }
val maxRevDSWithLagCount = spark.read.format("delta").load(maxRevDSWithLagCountPath)
val numberOfRows = maxRevDSWithLagCount.count
divUDF: org.apache.spark.sql.expressions.UserDefinedFunction = SparkUserDefinedFunction($Lambda$9266/71459450@6a0e8651,DoubleType,List(Some(class[value[0]: bigint]), Some(class[value[0]: bigint])),None,false,true)
maxRevDSWithLagCount: org.apache.spark.sql.DataFrame = [ticker: string, x: timestamp ... 6 more fields]
numberOfRows: Long = 6845798

The data is split into training and testing data. This is not done randomly as there is a dependence on previous data points. We don't want to train on data that is dependent on the testing data and therefore the training data consists of the first (for example) 70% of the data and the last 30% is saved for testing. This also reflects how the model would be used since we can only train on data points that have already been observed.
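A minimal sketch of this chronological split, assuming a stand-in sequence for the time-ordered rows of one ticker:

```scala
// Chronological (non-random) train/test split: the first trainingRatio of
// the rows are used for training, so no training point depends on test data.
val trainingRatio = 0.7
val rows = (1 to 10).toSeq                    // stand-in for time-ordered rows
val cut = (rows.size * trainingRatio).toInt   // first 70% for training
val (train, test) = rows.splitAt(cut)
// train: 1..7, test: 8..10
```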

val tickers = maxRevDSWithLagCount.select("ticker").distinct.as[String].collect.toSeq
val tickerDFs = tickers.map( ticker => maxRevDSWithLagCount.filter($"ticker" === ticker))
val trainingDF = tickerDFs.map( df => df.limit((df.count*trainingRatio).toInt) ).reduce( _.union(_) ).orderBy("x")
val trainingRows = trainingDF.count

val testingDF = maxRevDSWithLagCount.except(trainingDF)
tickers: Seq[String] = WrappedArray(XAUUSD, BCOUSD)
tickerDFs: Seq[org.apache.spark.sql.Dataset[org.apache.spark.sql.Row]] = ArrayBuffer([ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields])
trainingDF: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [ticker: string, x: timestamp ... 6 more fields]
trainingRows: Long = 4792058
testingDF: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [ticker: string, x: timestamp ... 6 more fields]

Create numPartitions training sets of increasing size to get snapshots of what a partially trained model looks like. The sizes are spaced logarithmically since the improvement in the model is fastest in the beginning.

val rowsInPartitions = (1 to numPartitions).map{ i: Int => (math.exp(math.log(trainingRows)*i/numPartitions)).toInt }//.scanLeft(0.0)(_-_)
rowsInPartitions: scala.collection.immutable.IndexedSeq[Int] = Vector(4, 21, 100, 470, 2189, 10193, 47464, 221012, 1029129, 4792058)

The model is trained by counting how many times each (lagKey, lagValue) pair is observed and dividing by the number of times lagKey is observed, yielding an estimate of the transition probabilities.
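The counting estimator can be sketched in plain Scala (hypothetical observations, no Spark):

```scala
// For each observed (lagKey, lagValue) pair, estimate
// P(value | key) = count(key, value) / count(key).
val observations: Seq[(Seq[Int], Seq[Int])] = Seq(
  (Seq(0, 0), Seq(1)), (Seq(0, 0), Seq(1)), (Seq(0, 0), Seq(-1)),
  (Seq(1, 0), Seq(-1))
)
val keyCounts: Map[Seq[Int], Double] =
  observations.groupBy(_._1).map { case (k, obs) => k -> obs.size.toDouble }
val transitionProbs: Map[(Seq[Int], Seq[Int]), Double] =
  observations.groupBy(identity).map { case (kv, obs) =>
    kv -> obs.size / keyCounts(kv._1)
  }
// transitionProbs((Seq(0, 0), Seq(1)))  == 2.0 / 3.0
// transitionProbs((Seq(1, 0), Seq(-1))) == 1.0
```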

val keyValueCountPartialDFs = rowsInPartitions.map(
  trainingDF
    .limit(_)
    .withColumn("keyValueObs", sum("count").over(Window.partitionBy($"lagKey", $"lagValue")))
    .withColumn("totalKeyObs", sum("count").over(Window.partitionBy($"lagKey")))
    .drop("count")
)
keyValueCountPartialDFs: scala.collection.immutable.IndexedSeq[org.apache.spark.sql.DataFrame] = Vector([ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields], [ticker: string, x: timestamp ... 7 more fields])
keyValueCountPartialDFs.last.orderBy($"keyValueObs".desc).show(20,false)
+------+-------------------+------+--------+-----+------------------------------+--------+-----------+-----------+
|ticker|x                  |y     |reversal|trend|lagKey                        |lagValue|keyValueObs|totalKeyObs|
+------+-------------------+------+--------+-----+------------------------------+--------+-----------+-----------+
|XAUUSD|2009-03-15 19:41:00|926.55|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-17 05:27:00|921.73|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 04:51:00|928.75|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 04:52:00|928.7 |0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 04:53:00|928.68|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 06:24:00|925.1 |-5      |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 09:19:00|917.73|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 09:20:00|917.83|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 09:21:00|917.0 |0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 14:47:00|922.9 |0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 14:48:00|922.88|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 14:49:00|922.93|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 14:50:00|922.83|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 22:49:00|922.28|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 22:50:00|922.25|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 22:51:00|922.13|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-16 22:52:00|922.08|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-17 05:11:00|923.8 |-1      |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-17 05:24:00|922.23|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
|XAUUSD|2009-03-17 05:25:00|922.03|0       |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]|[-1]    |53544      |106773     |
+------+-------------------+------+--------+-----+------------------------------+--------+-----------+-----------+
only showing top 20 rows
keyValueCountPartialDFs
  .map( df =>
    df
      .withColumn("probability", divUDF($"keyValueObs", $"totalKeyObs"))
      .drop("keyValueObs", "totalKeyObs")
  )
  .zip(partialModelPaths).map{ case (df: DataFrame, path: String) =>
    df.write.mode("overwrite").format("delta").save(path)  
  }
val probDFs = partialModelPaths.map(spark.read.format("delta").load(_))
probDFs: scala.collection.immutable.IndexedSeq[org.apache.spark.sql.DataFrame] = Vector([ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields], [ticker: string, x: timestamp ... 6 more fields])
probDFs.last.orderBy("probability").show(20,false)
+------+-------------------+-------+--------+-----+---------------------------------+--------+--------------------+
|ticker|x                  |y      |reversal|trend|lagKey                           |lagValue|probability         |
+------+-------------------+-------+--------+-----+---------------------------------+--------+--------------------+
|BCOUSD|2011-04-29 11:59:00|125.61 |2       |1    |[0, -4, 0, 0, 0, 1, 0, 0, 0, -1] |[1]     |0.007142857142857143|
|XAUUSD|2015-07-31 00:05:00|1084.69|-1      |-1   |[-2, 0, 0, 0, 1, -1, 0, 0, 0, 3] |[-1]    |0.013513513513513514|
|XAUUSD|2016-09-07 11:09:00|1344.18|-2      |-1   |[0, 0, 0, 0, 6, 0, 0, 0, -1, 1]  |[-1]    |0.014492753623188406|
|XAUUSD|2015-11-06 00:05:00|1108.14|-1      |-1   |[0, 4, 0, 0, -1, 0, 0, 0, 0, 1]  |[-1]    |0.014705882352941176|
|XAUUSD|2010-05-13 01:54:00|1237.83|2       |1    |[-3, 0, 0, 1, 0, 0, 0, 0, 0, -1] |[1]     |0.01639344262295082 |
|BCOUSD|2011-04-05 16:30:00|121.63 |-3      |-1   |[-2, 0, 0, 0, 0, 1, 0, -1, 0, 3] |[-1]    |0.016666666666666666|
|XAUUSD|2009-09-30 22:16:00|1006.65|1       |1    |[0, 0, 0, 0, 1, 0, 0, -1, 1, -1] |[1]     |0.016666666666666666|
|XAUUSD|2010-11-05 04:43:00|1386.98|1       |1    |[0, 0, 0, 0, 1, 0, 0, -1, 1, -1] |[1]     |0.016666666666666666|
|XAUUSD|2010-11-19 15:13:00|1351.4 |-1      |-1   |[0, 0, 0, 0, 0, 1, -1, 0, 0, 5]  |[-1]    |0.017094017094017096|
|BCOUSD|2012-10-10 14:23:00|114.27 |-1      |-1   |[0, 0, 0, 0, 0, 1, -1, 0, 0, 5]  |[-1]    |0.017094017094017096|
|XAUUSD|2015-06-28 19:18:00|1185.37|2       |1    |[-4, 0, 0, 0, 1, 0, 0, 0, 0, -1] |[1]     |0.017241379310344827|
|BCOUSD|2015-10-15 01:18:00|49.75  |2       |1    |[0, -3, 0, 1, 0, -1, 0, 0, 1, -1]|[1]     |0.017543859649122806|
|XAUUSD|2012-02-27 20:23:00|1767.83|1       |1    |[0, -5, 0, 0, 0, 1, 0, 0, 0, -1] |[1]     |0.01818181818181818 |
|BCOUSD|2015-04-27 23:58:00|63.96  |-1      |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 8]   |[-1]    |0.0196078431372549  |
|XAUUSD|2009-04-08 15:54:00|880.88 |-1      |-1   |[0, 0, 0, 0, 0, 0, 0, 0, 0, 8]   |[-1]    |0.0196078431372549  |
|BCOUSD|2015-09-11 11:06:00|47.27  |-1      |-1   |[0, 0, 0, 0, 1, -1, 0, 0, 0, 6]  |[-1]    |0.0196078431372549  |
|XAUUSD|2012-01-24 23:36:00|1666.11|1       |1    |[0, 1, 0, 0, 0, -1, 0, 1, 0, -4] |[1]     |0.02040816326530612 |
|XAUUSD|2009-12-28 23:56:00|1104.68|-1      |-1   |[0, -2, 2, 0, 0, 0, 0, -1, 0, 1] |[-1]    |0.020833333333333332|
|BCOUSD|2014-03-12 09:33:00|107.74 |-3      |-1   |[0, 0, 7, 0, 0, 0, 0, -1, 0, 1]  |[-1]    |0.020833333333333332|
|XAUUSD|2016-04-06 09:50:00|1221.57|2       |1    |[0, -1, 0, 1, -3, 0, 0, 0, 1, -1]|[1]     |0.02127659574468085 |
+------+-------------------+-------+--------+-----+---------------------------------+--------+--------------------+
only showing top 20 rows

The prediction is given by taking

\[ \hat{V} \in \underset{V}{\operatorname{argmax}} \, P_K(V) \]

where *P_K(V)* is the estimated probability that V is the next trend, given that the last m points had reversals K. If the argmax contains more than one element, one is chosen uniformly at random.
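A plain-Scala sketch of this prediction rule, assuming a hypothetical map of estimated probabilities for one key:

```scala
import scala.util.Random

// Predict the value(s) with maximal estimated probability for the observed
// key, breaking ties uniformly at random within the argmax.
def predict(probs: Map[Seq[Int], Double]): Seq[Int] = {
  val maxP = probs.values.max
  val argmax = probs.collect { case (v, p) if p == maxP => v }.toSeq
  argmax(Random.nextInt(argmax.size))  // uniform choice within the argmax
}
// predict(Map(Seq(1) -> 0.6, Seq(-1) -> 0.4)) == Seq(1)
```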

val aggWindow = Window.partitionBy("lagKey").orderBy('probability desc)
val testedDFs = probDFs
  .map { df =>
     val predictionDF = df
      .select("lagKey", "lagValue", "probability")
      .distinct
      .withColumn("rank", rank().over(aggWindow))
      .filter("rank == 1")
      .groupBy("lagKey")
      .agg(collect_list("lagValue"))
    
    testingDF
      .filter($"ticker" === "BCOUSD")
      .join(predictionDF, Seq("lagKey"), "left")
      .withColumnRenamed("collect_list(lagValue)", "test")
  }
warning: there was one feature warning; for details, enable `:setting -feature' or `:replay -feature'
aggWindow: org.apache.spark.sql.expressions.WindowSpec = org.apache.spark.sql.expressions.WindowSpec@2dbd5c17
testedDFs: scala.collection.immutable.IndexedSeq[org.apache.spark.sql.DataFrame] = Vector([lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields], [lagKey: array<int>, ticker: string ... 7 more fields])

We use a binary loss function that indicates if the prediction was correct or not.

getRandomUDF is used to choose an element of the argmax uniformly at random. safeArr guards against patterns in the test data that were never observed in the training data, in which case the left join yields null.

val getRandomUDF = udf( (arr: Seq[Seq[Int]]) => {
  val safeArr = Option(arr).getOrElse(Seq[Seq[Int]]())
  if (safeArr.isEmpty) Seq[Int]() else safeArr(Random.nextInt(safeArr.size))
} )

val lossUDF = udf{ (value: Seq[Int], pred: Seq[Int]) =>
  if (value == pred) 0 else 1
}
getRandomUDF: org.apache.spark.sql.expressions.UserDefinedFunction = SparkUserDefinedFunction($Lambda$9416/1051216045@79bf6d73,ArrayType(IntegerType,false),List(Some(class[value[0]: array<array<int>>])),None,true,true)
lossUDF: org.apache.spark.sql.expressions.UserDefinedFunction = SparkUserDefinedFunction($Lambda$9417/777777096@6e1c4ce7,IntegerType,List(Some(class[value[0]: array<int>]), Some(class[value[0]: array<int>])),None,false,true)
val lossDFs = testedDFs.map(_.withColumn("prediction", getRandomUDF($"test")).withColumn("loss", lossUDF($"lagValue", $"prediction")))
lossDFs: scala.collection.immutable.IndexedSeq[org.apache.spark.sql.DataFrame] = Vector([lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields], [lagKey: array<int>, ticker: string ... 9 more fields])

Already in row 2 we see a case where the lagKey was not previously observed (indicated by null in the test column).

lossDFs.last.show(20,false)
+---------------------------------+------+-------------------+-----+--------+-----+--------+-----+------+----------+----+
|lagKey                           |ticker|x                  |y    |reversal|trend|lagValue|count|test  |prediction|loss|
+---------------------------------+------+-------------------+-----+--------+-----+--------+-----+------+----------+----+
|[0, 0, 0, 0, 0, 0, -2, 0, 0, 0]  |BCOUSD|2017-10-02 02:03:00|56.56|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[-4, 0, 1, 0, 0, 0, -1, 0, 0, 0] |BCOUSD|2017-10-03 07:07:00|56.0 |1       |1    |[1]     |1    |[[-1]]|[-1]      |1   |
|[0, 1, 0, -1, 3, -1, 0, 1, 0, 0] |BCOUSD|2017-10-03 13:23:00|55.92|-2      |-1   |[-1]    |1    |[[1]] |[1]       |1   |
|[0, 2, 0, 0, -2, 0, 0, 0, 0, 1]  |BCOUSD|2017-10-13 06:59:00|57.43|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, 0, 0, 0, 1, -1, 2, 0, 0, -2] |BCOUSD|2017-10-18 07:21:00|58.21|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[0, 0, 0, -1, 0, 0, 0, 2, 0, 0]  |BCOUSD|2017-10-25 06:08:00|58.21|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, -4, 1, 0, -1, 1, 0, -1, 0, 0]|BCOUSD|2017-10-31 08:44:00|60.52|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[0, 0, -1, 0, 1, 0, -1, 0, 1, 0] |BCOUSD|2017-11-15 12:35:00|61.96|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, 0, 0, 0, 0, 0, 0, 0, -4, 0]  |BCOUSD|2017-11-15 13:28:00|62.02|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[1, 0, -1, 0, 2, -2, 0, 0, 0, 0] |BCOUSD|2017-11-16 11:37:00|61.7 |0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[0, 0, 0, -2, 0, 0, 0, 2, 0, 0]  |BCOUSD|2017-12-08 09:10:00|63.37|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, 0, 0, -1, 0, 0, 2, 0, 0, 0]  |BCOUSD|2017-12-15 06:19:00|63.45|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]   |BCOUSD|2017-12-19 01:46:00|63.56|0       |1    |[1]     |1    |[[-1]]|[-1]      |1   |
|[1, 0, -1, 0, 0, 0, 0, 0, 3, 0]  |BCOUSD|2017-12-21 07:09:00|64.39|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[-1, 0, 0, 0, 0, 0, 0, 1, 0, -1] |BCOUSD|2017-12-22 13:50:00|65.18|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[0, 1, 0, -1, 0, 0, 0, 0, 0, 1]  |BCOUSD|2017-12-27 14:51:00|65.92|-1      |-1   |[-1]    |1    |[[1]] |[1]       |1   |
|[0, 0, 0, 0, 1, 0, 0, -2, 1, -1] |BCOUSD|2018-01-03 22:15:00|67.91|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[0, 0, 0, -3, 0, 0, 1, 0, -1, 0] |BCOUSD|2018-01-08 10:17:00|67.72|0       |-1   |[-1]    |1    |[[-1]]|[-1]      |0   |
|[-1, 0, 1, 0, 0, 0, -1, 0, 1, 0] |BCOUSD|2018-01-23 16:24:00|70.15|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
|[0, 0, 1, 0, -1, 0, 0, 2, 0, 0]  |BCOUSD|2018-01-30 07:53:00|69.02|0       |1    |[1]     |1    |[[1]] |[1]       |0   |
+---------------------------------+------+-------------------+-----+--------+-----+--------+-----+------+----------+----+
only showing top 20 rows
val testLength = testingDF.count
val oilTestDF = testingDF.filter($"ticker" === "BCOUSD")
val oilTestLength = oilTestDF.count
testLength: Long = 2053732
oilTestDF: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [ticker: string, x: timestamp ... 6 more fields]
oilTestLength: Long = 857750

Out of roughly 2 million test observations, about 858,000 were oil prices.

We compute the mean loss on the test set for each training dataset of increasing size. As expected, the loss decreases as more training data is supplied.

Furthermore, training on both oil and gold data yields a better result than training on oil alone, suggesting that trends behave similarly in the two commodities.

val losses = lossDFs.map( _.agg(sum("loss")).select($"sum(loss)".as("totalLoss")).collect.head.getLong(0).toDouble/oilTestLength )
// Data up to 2019
// k=max, m=2 ,n=1: (0.5489615950080244, 0.4014721726541194, 0.3660973816818738, 0.36497087640146797, 0.36485163238050344)
// k=max, m=10,n=1: (0.9812730246821787, 0.8523390131056561, 0.5819573469954631, 0.3552177121357085, 0.2807503135425113)
// k=max,m=100,n=1: (1.0, 1.0, 1.0, 1.0, 1.0)
// k=5  , m=10,n=1: (0.9812730246821787, 0.8522267831239777, 0.5820597568537447, 0.3550507700379618, 0.2806352778112909)
// k=1  , m=10,n=1: (0.9812730246821787, 0.852200128503329, 0.5821242890932098, 0.3553383593660128, 0.2806549180580846)

// k=max, m=10,n=2: (0.9879380657977248, 0.9049354606556204, 0.7243136776273427, 0.5472986906951395, 0.4783823708897465)

// k=max(17), m=10,n=1, 10 training sets: (0.9977203284971564, 0.9812730246821787, 0.9455375956409875, 0.8523810993487855, 0.7183644724769999, 0.5819320952495854, 0.44607349380350214, 0.35513915114853356, 0.3029480010437388, 0.280629666312207)

// Data up to last month
// k=max(18), m=10,n=1, 10 training sets: (0.9973104051296998, 0.9843882250072865, 0.9510789857184494, 0.8574351501020111, 0.7515744680851064, 0.6360547945205479, 0.5099656076945497, 0.4249921305741766, 0.3735668901194987, 0.34609501603031184)
// Could the difference be due to Corona?

// Trained on both oil and gold; testing on oil as previously.
// k=max(18), m=10,n=1, 10 training sets: (0.9999778490236083, 0.9980728650539201, 0.9527158262897114, 0.8921317400174876, 0.7559988341591373, 0.5820915185077237, 0.46675488195861264, 0.3923101136694841, 0.3574876129408336, 0.3355837948120082)
losses: scala.collection.immutable.IndexedSeq[Double] = Vector(0.9999778490236083, 0.9980728650539201, 0.9528044301952784, 0.8921084232002332, 0.7558216263480035, 0.5819551151267852, 0.4666686097347712, 0.3920524628388225, 0.35745380355581463, 0.3355488195861265)

Visualizing the improvement of the model as the size of training data increases.

val trainingSizes = probDFs.map(_.count)
val lossesDS = sc
  .parallelize(losses.zip(trainingSizes))
  .toDF("loss", "size")
  .withColumn("training", lit("Oil and Gold"))
  .union(
    sc
      .parallelize(Seq(0.9973104051296998, 0.9843882250072865, 0.9510789857184494, 0.8574351501020111, 0.7515744680851064, 0.6360547945205479, 0.5099656076945497, 0.4249921305741766, 0.3735668901194987, 0.34609501603031184).zip(Seq(4, 18, 77, 331, 1414, 6036, 25759, 109918, 469036, 2001430)))
      .toDF("loss", "size")
      .withColumn("training", lit("Oil"))
  )
  .as[(Double,Long,String)]
trainingSizes: scala.collection.immutable.IndexedSeq[Long] = Vector(4, 21, 100, 470, 2189, 10193, 47464, 221012, 1029129, 4792058)
lossesDS: org.apache.spark.sql.Dataset[(Double, Long, String)] = [loss: double, size: bigint ... 1 more field]
display(lossesDS)
loss size training
0.9999778490236083 4.0 Oil and Gold
0.9980728650539201 21.0 Oil and Gold
0.9528044301952784 100.0 Oil and Gold
0.8921084232002332 470.0 Oil and Gold
0.7558216263480035 2189.0 Oil and Gold
0.5819551151267852 10193.0 Oil and Gold
0.4666686097347712 47464.0 Oil and Gold
0.3920524628388225 221012.0 Oil and Gold
0.35745380355581463 1029129.0 Oil and Gold
0.3355488195861265 4792058.0 Oil and Gold
0.9973104051296998 4.0 Oil
0.9843882250072865 18.0 Oil
0.9510789857184494 77.0 Oil
0.8574351501020111 331.0 Oil
0.7515744680851064 1414.0 Oil
0.6360547945205479 6036.0 Oil
0.5099656076945497 25759.0 Oil
0.4249921305741766 109918.0 Oil
0.3735668901194987 469036.0 Oil
0.34609501603031184 2001430.0 Oil

trendcalculusmcmodelperformance

The series using only oil data is very similar to the one using both oil and gold data. The final data point has a loss of about 0.35, meaning that the model accuracy is around 65% for predicting the trend of the next point in the time series.
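Since the per-row loss is 0 for a correct trend prediction and 1 otherwise, the mean loss is simply the misclassification rate, so the accuracy follows directly. A minimal sketch, using the final loss values printed above:

```scala
// The per-row loss is 0 (correct) or 1 (incorrect), so the mean loss is the
// misclassification rate and accuracy = 1 - meanLoss.
val finalLossOilAndGold = 0.3355488195861265   // last entry of `losses` above
val finalLossOil        = 0.34609501603031184  // last entry of the oil-only run

val accuracyOilAndGold = 1.0 - finalLossOilAndGold  // ≈ 0.664
val accuracyOil        = 1.0 - finalLossOil         // ≈ 0.654
```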

To better understand the difference between the different (partially trained) models, we want to calculate the total variation distance between them.

The first step is to collect the models and work with them locally.

val partialModels = probDFs.map{df => 
  val tmpMap = df
    .select("lagKey", "lagValue", "probability")
    .distinct
    .collect
    .map{ r => 
      (r.getAs[Seq[Int]](0), r.getAs[Seq[Int]](1), r.getDouble(2))
    }
    .groupBy(_._1) // Grouping by lagKey
    .mapValues(_.map(tup => Map(tup._2 -> tup._3)).flatten.toMap)
  
  tmpMap: FinalModel
}

If a lagKey sequence belongs to only one of the two models being compared, the variation distance is maximal, i.e. 1. Otherwise, we compute the total variation distance between the two models' probability distributions over the next n trends, conditioned on the sequence of trends lagKey.

def totalVarDist(m1: FinalModel, m2: FinalModel): Map[Seq[Int], Double] = {
  val allKeys = m1.keys.toSet.union(m2.keys.toSet)
  val sharedKeys = m1.keys.toSet.intersect(m2.keys.toSet)
  allKeys.toSeq.map{ key =>
    if (!sharedKeys.contains(key)) {
      key -> 1.0 // lagKey seen by only one model: maximal distance
    } else {
      val val1 = m1.getOrElse(key, Map())
      val val2 = m2.getOrElse(key, Map())
      val allValKeys = val1.keys.toSet.union(val2.keys.toSet)
      key -> allValKeys.map( valKey => 0.5*math.abs(val1.getOrElse(valKey, 0.0) - val2.getOrElse(valKey, 0.0)) ).sum
    }
  }.toMap
}
totalVarDist: (m1: FinalModel, m2: FinalModel)Map[Seq[Int],Double]
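To make the definition concrete, here is a small, self-contained toy example of the same computation, run locally without Spark. The type alias and the models `mA` and `mB` are invented for illustration; the alias mirrors the notebook's `FinalModel` shape, `Map[Seq[Int], Map[Seq[Int], Double]]`.

```scala
// Toy illustration of the total variation distance between two partial models.
type Model = Map[Seq[Int], Map[Seq[Int], Double]] // same shape as FinalModel

def totalVarDistLocal(m1: Model, m2: Model): Map[Seq[Int], Double] = {
  val allKeys = m1.keys.toSet.union(m2.keys.toSet)
  val sharedKeys = m1.keys.toSet.intersect(m2.keys.toSet)
  allKeys.toSeq.map { key =>
    if (!sharedKeys.contains(key)) key -> 1.0 // unseen in one model: maximal
    else {
      val (v1, v2) = (m1(key), m2(key))
      val valKeys = v1.keys.toSet.union(v2.keys.toSet)
      key -> valKeys.map(k => 0.5 * math.abs(v1.getOrElse(k, 0.0) - v2.getOrElse(k, 0.0))).sum
    }
  }.toMap
}

val mA: Model = Map(Seq(0, 1) -> Map(Seq(1) -> 0.7, Seq(-1) -> 0.3))
val mB: Model = Map(
  Seq(0, 1)  -> Map(Seq(1) -> 0.4, Seq(-1) -> 0.6),
  Seq(1, -1) -> Map(Seq(1) -> 1.0)
)

totalVarDistLocal(mA, mB)
// Seq(0, 1)  -> 0.5 * (|0.7 - 0.4| + |0.3 - 0.6|) = 0.3
// Seq(1, -1) -> 1.0 (present only in mB)
```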

Computing the variation distance for all possible pairs of partially trained models.

val totalVariationDistances = partialModels.map( m1 => partialModels.map( m2 => totalVarDist(m1,m2) ) )
totalVariationDistances: scala.collection.immutable.IndexedSeq[scala.collection.immutable.IndexedSeq[Map[Seq[Int],Double]]] = Vector(Vector(Map(WrappedArray(0, 1, -1, 0, 0, 0, 0, 0, 4, -2) -> 0.0, WrappedArray(-6, 0, 0, 1, -1, 0, 0, 0, 0, 0) -> 0.0, WrappedArray(0, 0, 1, -1, 0, 0, 0, 0, 0, 4) -> 0.0, WrappedArray(1, -1, 0, 0, 0, 0, 0, 4, -2, 0) -> 0.0), Map(WrappedArray(-6, 0, 0, 1, -1, 0, 0, 0, 0, 0) -> 0.0, WrappedArray(0, 2, 0, 0, 0, 0, 0, 0, -4, 0) -> 1.0, ...), ...), ...)
1.0, WrappedArray(0, 1, -1, 1, -1, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 2, 0, -1, 0) -> 1.0, WrappedArray(0, 0, 3, 0, -1, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(-4, 0, 0, 0, 0, 2, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 1, 0, -1, 0, 0, 2, 0) -> 1.0, WrappedArray(0, -1, 2, 0, 0, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, -3, 0, 0, 1, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(1, -1, 0, 1, -1, 0, 0, 3, -1, 0) -> 1.0, WrappedArray(0, 0, 4, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, -3, 0, 1, 0, -1, 0, 2, -2) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 3, -1, 0, 1, -1, 0, 1, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, -2, 1, 0, -1, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, 0, 0, 0, 0, -2, 0) -> 1.0, WrappedArray(-2, 0, 1, -1, 0, 0, 1, 0, -1, 0) -> 1.0, WrappedArray(2, 0, 0, 0, -1, 0, 1, 0, 0, -2) -> 1.0, WrappedArray(1, 0, -2, 0, 0, 0, 0, 0, 3, -1) -> 1.0, WrappedArray(0, 0, 0, 2, 0, -1, 1, 0, 0, -2) -> 1.0, WrappedArray(0, 0, 0, -2, 2, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, -2, 1, 0, -1, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(0, 2, 0, -1, 0, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, -1, 0, 0, 2, -2) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 1, 0, 0, -1, 1) -> 1.0, WrappedArray(0, -3, 0, 0, 2, -2, 0, 1, 0, -1) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -2, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 1, 0, -1, 1) -> 1.0, WrappedArray(1, -1, 0, 0, 2, -2, 1, 0, -1, 0) -> 1.0, WrappedArray(-1, 0, 0, 1, 0, -2, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, -4, 0, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 1, -1, 0, 0, 0, 3) -> 1.0, WrappedArray(0, 0, 3, -1, 1, 0, 0, 0, 0, -2) -> 1.0, WrappedArray(-1, 0, 0, 2, 0, 0, 0, -2, 0, 0) -> 1.0, WrappedArray(0, 0, 2, 0, 0, 0, 0, 0, -1, 1) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -1, 1, -2, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 1, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(-2, 0, 0, 0, 0, 1, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, -3, 0, 0, 2, 0, 0, 0) -> 1.0, WrappedArray(-5, 0, 0, 0, 0, 0, 0, 1, 
-1, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 0, 6, 0, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, -1, 0, 5, -1, 0) -> 1.0, WrappedArray(0, -1, 1, 0, -1, 0, 0, 2, 0, 0) -> 1.0, WrappedArray(4, 0, 0, 0, 0, 0, 0, 0, -1, 1) -> 1.0, WrappedArray(0, 0, -1, 2, 0, 0, -1, 0, 1, 0) -> 1.0, WrappedArray(-6, 0, 0, 1, -1, 0, 0, 0, 0, 0) -> 0.0, WrappedArray(0, 0, 0, 0, 0, 0, -2, 2, -1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 0, 0, 1, -1, 0) -> 1.0, WrappedArray(0, -4, 0, 1, 0, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 0, 1, 0, -1, 0, 0) -> 1.0, WrappedArray(0, 0, -4, 0, 0, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 2, 0, -1, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(-1, 1, 0, 0, 0, 0, -2, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 5, 0, 0, 0, 0, 0, -1, 0, 1) -> 1.0, WrappedArray(-1, 1, 0, 0, -1, 0, 0, 1, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, 0, -1, 5, -1, 1) -> 1.0, WrappedArray(1, 0, -3, 0, 4, -2, 2, -1, 0, 0) -> 1.0, WrappedArray(0, 6, 0, 0, 0, -1, 0, 0, 1, 0) -> 1.0, WrappedArray(-1, 0, 0, 0, 0, 0, 1, -2, 0, 1) -> 1.0, WrappedArray(0, 0, 0, -2, 0, 1, 0, 0, 0, -1) -> 1.0, WrappedArray(-1, 0, 2, 0, 0, 0, -3, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 5, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, -1, 0, 0, 0, 0, 1, 0, -1) -> 1.0, WrappedArray(0, 1, 0, -1, 0, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, 5, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, -1, 0, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, -2, 0, 0, 1, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 5, 0, 0, 0, 0) -> 1.0, WrappedArray(-1, 0, 1, -1, 0, 1, -3, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 1, 0, -1, 1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -4, 0, 1, 0, -1, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 1, 0, 0, 0, 0, -5) -> 1.0, WrappedArray(0, -1, 0, 1, 0, -3, 1, 0, 0, -1) -> 1.0, WrappedArray(0, 0, -1, 0, 0, 0, 0, 0, 
1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, -1, 1) -> 1.0, WrappedArray(0, 0, -1, 0, 0, 1, 0, -1, 1, 0) -> 1.0, WrappedArray(0, 6, 0, 0, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(-2, 0, 0, 0, 0, 0, 0, 0, 2, 0) -> 1.0, WrappedArray(0, 3, 0, -2, 2, 0, -1, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 1, -1, 0, 0, 0, 2) -> 1.0, WrappedArray(0, 0, -1, 0, 0, 0, 0, 1, -1, 0) -> 1.0, WrappedArray(-3, 1, 0, 0, -1, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(-1, 0, 1, -1, 0, 0, 3, -1, 0, 1) -> 1.0, WrappedArray(0, -1, 0, 0, 0, 2, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, -1, 1, -1, 0, 0) -> 1.0, WrappedArray(-2, 0, 0, 0, 0, 0, 3, -1, 0, 1) -> 1.0, WrappedArray(0, 0, -2, 0, 0, 0, 1, 0, 0, -1) -> 1.0, WrappedArray(6, -2, 0, 0, 0, 2, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 0, 1, 0, 0, -1) -> 1.0, WrappedArray(0, 2, 0, 0, -3, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 1, -1, 3, 0) -> 1.0, WrappedArray(0, 0, -1, 0, 0, 3, 0, 0, -3, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -4, 0, 1, 0, 0) -> 1.0, WrappedArray(-1, 1, 0, 0, -2, 0, 2, 0, -1, 0) -> 1.0, WrappedArray(-1, 0, 0, 1, -1, 0, 0, 2, 0, 0) -> 1.0, WrappedArray(0, 0, 2, 0, -1, 0, 1, -1, 0, 1) -> 1.0, WrappedArray(-1, 0, 1, 0, -2, 0, 0, 1, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 0, 1, -1, 0, 1, -1, 0) -> 1.0, WrappedArray(-4, 0, 1, 0, 0, -1, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 2, 0, -2, 1, 0) -> 1.0, WrappedArray(2, 0, -1, 0, 0, 1, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 0, -4, 0, 0, 4, 0, -2) -> 1.0, WrappedArray(0, 1, -1, 0, 1, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(0, -1, 0, 0, 0, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 2, 0, 0, 0, -2) -> 1.0, WrappedArray(-5, 0, 0, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 1, -1, 0, 1, -1, 0) -> 1.0, WrappedArray(-5, 0, 0, 0, 0, 1, -1, 0, 0, 4) -> 1.0, WrappedArray(-1, 2, 0, 0, -1, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, -1, 0, 1, 0, 0, -1, 0, 0) -> 1.0, WrappedArray(-3, 0, 0, 2, -2, 0, 1, -1, 0, 0) -> 1.0, WrappedArray(-3, 0, 0, 2, 
-2, 0, 1, 0, -1, 0) -> 1.0, WrappedArray(0, 1, -3, 0, 0, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, 0, 0, -5, 0, 0) -> 1.0, WrappedArray(0, 1, 0, -4, 0, 2, -1, 1, 0, -2) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 1, -2, 0, 1, -1, 2, 0, -1) -> 1.0, WrappedArray(0, 1, -1, 3, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, -1, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, -5, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -5, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, 3, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 0, 4, 0, 0, -3) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -4, 0, 0, 4, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, 5, 0) -> 1.0, WrappedArray(1, 0, 0, -1, 0, 2, -2, 3, -1, 1) -> 1.0, WrappedArray(0, -1, 0, 1, -1, 0, 1, -3, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 2, 0, 0, -4, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 0, 1, 0, -3, 0) -> 1.0, WrappedArray(-1, 0, 0, 0, 2, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 6, 0, 0, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(1, 0, -1, 0, 1, 0, 0, -1, 0, 2) -> 1.0, WrappedArray(0, -2, 0, 0, 1, -1, 2, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 2, 0, 0, 0, 0, -1, 1) -> 1.0, WrappedArray(0, 0, 1, 0, 0, -1, 0, 2, 0, 0) -> 1.0, WrappedArray(0, 2, -1, 0, 0, 0, 1, -3, 3, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 0, 2, 0, 0, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 1, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 2, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 2, 0, 0, 0, 0, 0, 0, -4, 0) -> 1.0, WrappedArray(-2, 2, 0, -1, 0, 1, -1, 0, 1, -3) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 1, 0, -2, 0) -> 1.0, WrappedArray(0, 0, 1, 0, 0, -1, 0, 4, 0, 0) -> 1.0, WrappedArray(-1, 0, 0, 0, 5, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 1, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 2, 0, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 2, 0, -1, 0, 0) -> 1.0, WrappedArray(0, 0, -1, 0, 1, -2, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, -2, 
0, 1, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, 0, -1, 0, 1, -2) -> 1.0, WrappedArray(0, 0, 0, 4, 0, 0, -3, 1, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 2, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(5, -1, 1, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-2, 0, 0, 2, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -6, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 1, 0, 0, -2, 0, 0, 0, 2, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 7, 0, 0, -1, 1, 0) -> 1.0, WrappedArray(3, 0, 0, -2, 0, 0, 1, 0, -1, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 1, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 1, 0, 0, 0, 0, -1, 0, 0) -> 1.0, WrappedArray(-3, 0, 1, 0, -1, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, -1, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(-1, 1, 0, -1, 0, 0, 0, 1, -1, 0) -> 1.0, WrappedArray(0, 0, 1, -1, 0, 2, 0, -1, 0, 1) -> 1.0, WrappedArray(1, -1, 0, 0, 3, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 0, 0, 4, -2) -> 0.0, WrappedArray(0, 0, 0, 0, 1, 0, -1, 0, 2, 0) -> 1.0, WrappedArray(-1, 1, -2, 0, 0, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(0, 0, -2, 0, 0, 1, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 4, 0, 0, -2, 0, 0, 2, -1, 0) -> 1.0, WrappedArray(1, 0, -2, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, -1, 1, 0, -1, 0, 0, 2, 0) -> 1.0, WrappedArray(0, 1, 0, -2, 1, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, 7, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 0, 4, 0, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, -1, 0, 2, -2, 3) -> 1.0, WrappedArray(0, -1, 1, -2, 1, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 1, 0, 0, 0, -1) -> 1.0, WrappedArray(-1, 0, 0, 0, 3, 0, -1, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 2, 0, 0, -3, 0, 3, 0, -2) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, -2, 0, 0) -> 1.0, WrappedArray(-1, 0, 0, 1, 0, -2, 0, 2, 0, -1) -> 1.0, WrappedArray(0, 0, -1, 1, -2, 1, 0, -1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 2, 0, 0, -4, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 1, 0, 0, -1, 0, 2, 0) -> 1.0, WrappedArray(0, 2, 0, -1, 1, 0, 0, -2, 0, 2) -> 1.0, WrappedArray(-2, 3, -1, 
1, 0, 0, -3, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 1, -1, 0, 0, 2, 0) -> 1.0, WrappedArray(-2, 1, 0, 0, -1, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -5, 0, 0, 0, 0, 0, 2) -> 1.0, WrappedArray(1, 0, -2, 0, 0, 1, -1, 2, 0, 0) -> 1.0, WrappedArray(1, 0, -1, 0, 0, 2, -1, 1, 0, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 5, -1, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -2, 0, 0, 1, 0, -1) -> 1.0, WrappedArray(-1, 0, 1, 0, -2, 0, 0, 3, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 1, -1, 0, 0, 1) -> 1.0, WrappedArray(-2, 0, 0, 0, 2, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(1, 0, -3, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-1, 0, 0, 0, 0, 0, 3, 0, 0, 0) -> 1.0, WrappedArray(0, 0, -2, 2, -1, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(2, 0, -1, 0, 1, 0, -2, 0, 2, 0) -> 1.0, WrappedArray(-5, 0, 0, 0, 0, 0, 2, 0, -2, 1) -> 1.0, WrappedArray(1, 0, 0, -1, 2, 0, 0, -1, 0, 1) -> 1.0, WrappedArray(2, -1, 0, 0, 0, 1, 0, -2, 0, 0) -> 1.0, WrappedArray(0, 0, 2, 0, 0, 0, 0, -1, 1, -2) -> 1.0, WrappedArray(0, 0, 2, -2, 0, 1, 0, -1, 0, 0) -> 1.0, WrappedArray(0, 2, 0, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-2, 0, 0, 3, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -3, 1, 0, 0, -1, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 1, 0, -1, 0, 1, -1, 0) -> 1.0, WrappedArray(0, -2, 0, 0, 0, 0, 1, 0, 0, -1) -> 1.0, WrappedArray(-1, 0, 2, 0, 0, -3, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 1, 0, 0, 0, -1, 0, 2) -> 1.0, WrappedArray(1, 0, -2, 1, 0, -1, 0, 0, 0, 3) -> 1.0, WrappedArray(2, -1, 0, 1, 0, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, -2, 0, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -1, 0, 0, 0, 1) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 0, 0, 0, 2, -2) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, -4, 0, 1, -1, 0) -> 1.0, WrappedArray(-1, 0, 0, 1, 0, -1, 0, 0, 0, 2) -> 1.0, WrappedArray(3, -1, 1, 0, 0, -3, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, -1, 0, 4, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -1, 0, 0, 1, -1) -> 1.0, WrappedArray(0, 
1, 0, 0, -4, 0, 1, -1, 0, 0) -> 1.0, WrappedArray(0, 0, 1, -1, 0, 1, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 4, 0, 0, -2, 1, -1, 0, 2, -1) -> 1.0, WrappedArray(0, 0, -3, 0, 0, 2, -2, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 3, 0, 0, 0, 0, -1) -> 1.0, WrappedArray(-1, 0, 3, 0, 0, 0, -2, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, -4, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-1, 0, 2, 0, 0, 0, 0, 0, 0, -4) -> 1.0, WrappedArray(2, 0, -1, 0, 1, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(-1, 1, -6, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 1, -1, 0, 1, -1, 0, 0, 3) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -2, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 3, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -2, 1, -1, 0, 0, 0, 2, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 1, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 2, -2, 1, 0, -1, 0, 0, 3, -1) -> 1.0, WrappedArray(0, 1, -3, 0, 0, 0, 0, 0, 3, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 0, 0, 1, 0, 0, -4) -> 1.0, WrappedArray(0, 2, -2, 0, 1, -1, 0, 0, 1, 0) -> 1.0, WrappedArray(1, 0, -2, 0, 1, 0, -1, 1, 0, -1) -> 1.0, WrappedArray(-1, 0, 3, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 2, 0, -1, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 1, 0, -2, 0, 0, 1, -1, 2) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, -4, 0) -> 1.0, WrappedArray(0, 0, 0, 2, 0, -1, 0, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 1, -1, 0, 0, 2, 0, 0, 0) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 1, 0, 0, -2) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 2, 0, 0, -3, 0) -> 1.0, WrappedArray(0, 0, 0, 1, -1, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(2, 0, 0, 0, 0, -1, 1, -2, 0, 0) -> 1.0, WrappedArray(2, 0, -1, 0, 1, -1, 0, 1, 0, -2) -> 1.0, WrappedArray(0, 0, 1, 0, -1, 0, 3, 0, 0, 0) -> 1.0, WrappedArray(0, 4, -2, 0, 1, 0, 0, -1, 0, 2) -> 1.0, WrappedArray(0, 1, 0, -2, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(2, 0, 0, 0, 0, -2, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 2, -2, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 2, 0, -2, 1, 0, 0, -1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -1, 1, -2, 0, 0) -> 1.0, 
WrappedArray(0, -1, 0, 0, 0, 0, 0, 4, 0, 0) -> 1.0, WrappedArray(6, 0, 0, 0, -1, 0, 0, 1, 0, -2) -> 1.0, WrappedArray(0, 0, 0, -5, 0, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(-2, 0, 0, 0, 0, 4, 0, 0, -2, 0) -> 1.0, WrappedArray(-1, 0, 1, -1, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, -2, 0, 0, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(-3, 0, 0, 0, 0, 0, 0, 1, 0, -1) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -4, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 1, -1, 0, 2, 0) -> 1.0, WrappedArray(0, 0, -1, 0, 0, 2, -1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 1, -1, 1, -1, 0) -> 1.0, WrappedArray(0, 0, 0, 3, 0, 0, -2, 2, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, -4, 0, 0, 4) -> 1.0, WrappedArray(-1, 1, 0, 0, 0, -1, 1, 0, -1, 0) -> 1.0, WrappedArray(0, 0, -2, 0, 1, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(0, -1, 0, 1, 0, 0, -3, 0, 0, 2) -> 1.0, WrappedArray(0, -1, 0, 1, 0, 0, 0, 0, 0, -2) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 2, 0, 0, 0, 0, -2, 0, 0, 2) -> 1.0, WrappedArray(-2, 0, 0, 1, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 0, 0, 0, 0, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 0, 4, -2, 0, 1, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 2, 0, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 1, -1, 2, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(1, 0, 0, 0, 0, -2, 0, 1, -1, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, -1, 0, 0, 2, -1) -> 1.0, WrappedArray(0, -2, 0, 0, 0, 0, 0, 1, 0, -1) -> 1.0, WrappedArray(1, 0, -2, 0, 2, 0, -1, 0, 1, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, 0, -2, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, 3, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 2, -2, 0, 6, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 0, 0, 0, 2, 0, 0) -> 1.0, WrappedArray(-2, 1, 0, -1, 0, 0, 0, 3, 0, -1) -> 1.0, WrappedArray(0, 1, -1, 0, 0, 0, 3, 0, 0, -2) -> 1.0, WrappedArray(-1, 1, 0, 0, 0, 0, -1, 1, -2, 1) -> 1.0, WrappedArray(0, 0, 1, 0, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 0, 5, 0, -1, 1, -2, 0, 0, 2) -> 1.0, WrappedArray(0, 0, 0, -2, 0, 1, -1, 0, 0, 0) -> 
1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 7, 0, 0) -> 1.0, WrappedArray(3, 0, -1, 1, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 1, 0, 0, 0, 0, -5, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, 0, -1, 1, 0, 0) -> 1.0, WrappedArray(0, 0, -2, 2, 0, 0, -1, 1, 0, 0) -> 1.0, WrappedArray(-2, 0, 0, 0, 0, 0, 0, 0, 0, 3) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -6, 0, 0, 0, 0, 0, 4) -> 1.0, WrappedArray(1, 0, -1, 0, 1, 0, -1, 0, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 0, 0, 1, -1, 0, 2, 0) -> 1.0, WrappedArray(0, -2, 0, 1, -1, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(1, 0, -1, 0, 0, 1, 0, 0, -4, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 0, 0, 1, -1, 0, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, -1, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 3, -1, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 0, 0, 1, 0, -2) -> 1.0, WrappedArray(0, 1, -1, 0, 1, 0, -2, 1, 0, -1) -> 1.0, WrappedArray(0, 0, 1, 0, 0, 0, 0, -1, 0, 1) -> 1.0, WrappedArray(0, -1, 1, 0, -2, 0, 1, 0, -1, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 1, 0, -1, 0, 0, 2) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 1, 0, -2, 0, 1) -> 1.0, WrappedArray(0, -4, 0, 0, 0, 0, 2, 0, 0, 0) -> 1.0, WrappedArray(0, 2, 0, 0, 0, 0, -2, 1, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 3, 0, 0, -2, 0, 0, 1) -> 1.0, WrappedArray(0, 0, 2, 0, -1, 1, 0, 0, -2, 0) -> 1.0, WrappedArray(-1, 0, 1, 0, 0, -2, 0, 0, 0, 1) -> 1.0, WrappedArray(0, -1, 1, -6, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, -3, 0, 0, 0, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 3, 0, -1, 1) -> 1.0, WrappedArray(0, 0, 0, 1, -2, 0, 1, -1, 2, 0) -> 1.0, WrappedArray(0, 0, -4, 0, 1, 0, -1, 0, 1, 0) -> 1.0, WrappedArray(0, 1, 0, 0, 0, -1, 0, 1, 0, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 1, -1, 0, 0, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 0, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 1, -1, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(0, 0, 0, 1, -1, 0, 0, 0, 0, 4) -> 1.0, WrappedArray(1, 0, -1, 0, 1, 0, 0, 0, 0, 0) -> 
1.0, WrappedArray(-1, 0, 0, 0, 1, -1, 0, 2, -2, 0) -> 1.0, WrappedArray(-1, 0, 0, 2, -1, 0, 0, 0, 1, 0) -> 1.0, WrappedArray(1, 0, -1, 0, 0, 0, 2, -1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -1, 1, 0, -2, 0, 1, 0) -> 1.0, WrappedArray(0, -1, 0, 0, 0, 1, 0, 0, 0, -1) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 1, -2, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 1, 0, -1, 0, 0, 2, -1) -> 1.0, WrappedArray(0, 0, 0, 0, 2, 0, 0, 0, 0, -2) -> 1.0, WrappedArray(-1, 4, 0, -1, 0, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 2, 0, -1, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 3, 0, 0, -2, 0) -> 1.0, WrappedArray(0, 1, 0, -1, 0, 0, 1, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 1, 0, -1, 0, 3, 0) -> 1.0, WrappedArray(2, 0, -1, 1, 0, 0, 0, -1, 0, 0) -> 1.0, WrappedArray(0, -2, 0, 0, 0, 0, 0, 3, -1, 0) -> 1.0, WrappedArray(0, 0, 1, 0, -1, 0, 0, 0, 1, -1) -> 1.0, WrappedArray(-1, 0, 1, -1, 0, 0, 0, 1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -2, 2, -1, 0, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, -1, 0, 0, 1, 0, 0) -> 1.0, WrappedArray(-4, 0, 0, 1, 0, 0, 0, 0, -1, 0) -> 1.0, WrappedArray(0, 0, 0, 1, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 3, -1, 0, 1, 0, 0, -3, 1) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 1, -1, 4) -> 1.0, WrappedArray(0, -6, 0, 0, 0, 0, 0, 4, 0, 0) -> 1.0, WrappedArray(0, -3, 0, 1, 0, -1, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, -1, 0, 0, 2, -1, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 0, 0, -5) -> 1.0, WrappedArray(-1, 3, 0, 0, 0, 0, 0, 0, -2, 0) -> 1.0, WrappedArray(0, 0, 1, -1, 0, 2, 0, 0, 0, -3) -> 1.0, WrappedArray(0, 0, 0, 1, 0, -1, 0, 0, 1, 0) -> 1.0, WrappedArray(-1, 0, 0, 4, 0, 0, -2, 1, -1, 0) -> 1.0, WrappedArray(0, 0, 2, 0, 0, 0, 0, -2, 1, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, -5, 0, 0, 0, 0) -> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 3, 0, 0, 0) -> 1.0, WrappedArray(1, -1, 0, 0, 0, 0, 4, 0, 0, 0) -> 1.0, WrappedArray(1, -1, 1, -1, 0, 0, 0, 1, -1, 0) -> 1.0, WrappedArray(0, 5, 0, -1, 1, -2, 0, 0, 2, 0) -> 1.0, WrappedArray(0, -1, 0, 1, -2, 0, 0, 0, 0, 0) 
-> 1.0, WrappedArray(0, 0, 0, 0, 0, 0, 0, 1, -1, 0) -> 1.0, WrappedArray(-8, 0, 0, 0, 0, 0, 0, 0, 0, 0) -> ...
def aggToMatrix(totalVarDists: Seq[Seq[Map[Seq[Int],Double]]], aggFunc: Seq[Double] => Double): Seq[Seq[Double]] = {
  totalVarDists.map(_.map( t => aggFunc(t.values.toSeq)))
}

def printMatrix(mat: Seq[Seq[Double]]): Unit = {
  mat.foreach( s => { s.foreach( a => print(f"$a%2.3f ") ); println() } )
}
aggToMatrix: (totalVarDists: Seq[Seq[Map[Seq[Int],Double]]], aggFunc: Seq[Double] => Double)Seq[Seq[Double]]
printMatrix: (mat: Seq[Seq[Double]])Unit

We compute three different summary statistics of the total variation distances.

  • maxDists is the maximal total variation distance over the different lagKeys.
  • minDists is the minimal total variation distance over the different lagKeys.
  • meanDists is an estimate of the average total variation distance over the different lagKeys.
val maxDists = aggToMatrix(totalVariationDistances, s => s.max)
val minDists = aggToMatrix(totalVariationDistances, s => s.min)
val meanDists = aggToMatrix(totalVariationDistances, s => s.sum/s.size)
maxDists: Seq[Seq[Double]] = Vector(Vector(0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0), Vector(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0))
minDists: Seq[Seq[Double]] = Vector(Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1111111111111111), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), Vector(0.1111111111111111, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0))
meanDists: Seq[Seq[Double]] = Vector(Vector(0.0, 0.8095238095238095, 0.9595959595959596, 0.9902676399026764, 0.9975932611311673, 0.9992509363295881, 0.999785112600997, 0.9999212988949179, 0.9999637822897781, 0.9999828303297199), Vector(0.8095238095238095, 0.0, 0.7878787878787878, 0.948905109489051, 0.9879412354592861, 0.9962818797931157, 0.9988235824354509, 0.9995277161089322, 0.9997797200520187, 0.9999006682605777), Vector(0.9595959595959596, 0.7878787878787878, 0.0, 0.7675182481751823, 0.9456465498353374, 0.9839165434703036, 0.9949339063117071, 0.9979342365055828, 0.9990263999070653, 0.9995453963182023), Vector(0.9902676399026764, 0.948905109489051, 0.7675182481751823, 0.0, 0.76569626045364, 0.9313390461281573, 0.9778513135067101, 0.9912980359273889, 0.9959874223204074, 0.998122961240012), Vector(0.9975932611311673, 0.9879412354592861, 0.9456465498353374, 0.76569626045364, 0.0, 0.709855350171288, 0.9069532060773262, 0.9635540738278638, 0.9834585383266629, 0.992240400344003), Vector(0.9992509363295881, 0.9962818797931157, 0.9839165434703036, 0.9313390461281573, 0.709855350171288, 0.0, 0.6846310864524279, 0.8780374739844375, 0.9450969813814848, 0.9743411917827652), Vector(0.999785112600997, 0.9988235824354509, 0.9949339063117071, 0.9778513135067101, 0.9069532060773262, 0.6846310864524279, 0.0, 0.6217124078957903, 0.8326361431674983, 0.9226862715145945), Vector(0.9999212988949179, 0.9995277161089322, 0.9979342365055828, 0.9912980359273889, 0.9635540738278638, 0.8780374739844375, 0.6217124078957903, 0.0, 0.57277338181383, 0.8064805024139434), Vector(0.9999637822897781, 0.9997797200520187, 0.9990263999070653, 0.9959874223204074, 0.9834585383266629, 0.9450969813814848, 0.8326361431674983, 0.57277338181383, 0.0, 0.5653338124646969), Vector(0.9999828303297199, 0.9999006682605777, 0.9995453963182023, 0.998122961240012, 0.992240400344003, 0.9743411917827652, 0.9226862715145945, 0.8064805024139434, 0.5653338124646969, 0.0))

Each model is a mapping {key: {value: probability}} where key (lagKey) is a sequence of reversals and non-reversals of length m, value is a sequence of trends of length n and probability is the estimated probability that value is observed immediately following key.

Hence, for any two models A and B, we can calculate the total variation distance between the mappings {value: probability} for a given key in the union of the keys for A and B. If a key is not present in one of the models, the total variation distance is 1 for that key.
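The per-key comparison described above can be sketched as follows. This is a minimal illustration, not the notebook's exact helper: it assumes each model is represented as a `Map[Seq[Int], Map[Seq[Int], Double]]` from lagKey to a conditional distribution over values, and it encodes the convention that a key missing from either model yields distance 1.

```scala
// Total variation distance between the conditional distributions of two
// models for a single lagKey. TV(p, q) = (1/2) * sum_x |p(x) - q(x)|.
// By the convention in the text, a lagKey absent from either model
// contributes the maximal distance 1.0.
def tvDistForKey(
    modelA: Map[Seq[Int], Map[Seq[Int], Double]],
    modelB: Map[Seq[Int], Map[Seq[Int], Double]],
    lagKey: Seq[Int]
): Double =
  (modelA.get(lagKey), modelB.get(lagKey)) match {
    case (Some(p), Some(q)) =>
      // Sum |p(x) - q(x)| over the union of both supports.
      val support = p.keySet union q.keySet
      support.toSeq.map(v => math.abs(p.getOrElse(v, 0.0) - q.getOrElse(v, 0.0))).sum / 2.0
    case _ => 1.0 // key missing in at least one model
  }
```

Averaging `tvDistForKey` over the union of the two models' keysets gives one entry of the mean-distance matrix below.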

In the matrix below, the (i,j)-th position is the arithmetic mean of the total variation distances for all keys in the union of the keysets. The matrix is symmetric with the smallest model in the top row and leftmost column and the largest model in the bottom row and rightmost column. All diagonal elements are 0 since the total variation distance from one model to itself is always 0.

If there are three models labeled M1, M2, M3 and \(V_{i,j}\) is the arithmetic mean described above, the matrix is

\[ \begin{matrix} V_{1,1} & V_{1,2} & V_{1,3} \\ V_{2,1} & V_{2,2} & V_{2,3} \\ V_{3,1} & V_{3,2} & V_{3,3} \end{matrix} \]

As one can see, the models differ substantially from each other, suggesting that the estimates could still be improved with more data.

printMatrix(meanDists)
0.000 0.810 0.960 0.990 0.998 0.999 1.000 1.000 1.000 1.000 
0.810 0.000 0.788 0.949 0.988 0.996 0.999 1.000 1.000 1.000 
0.960 0.788 0.000 0.768 0.946 0.984 0.995 0.998 0.999 1.000 
0.990 0.949 0.768 0.000 0.766 0.931 0.978 0.991 0.996 0.998 
0.998 0.988 0.946 0.766 0.000 0.710 0.907 0.964 0.983 0.992 
0.999 0.996 0.984 0.931 0.710 0.000 0.685 0.878 0.945 0.974 
1.000 0.999 0.995 0.978 0.907 0.685 0.000 0.622 0.833 0.923 
1.000 1.000 0.998 0.991 0.964 0.878 0.622 0.000 0.573 0.806 
1.000 1.000 0.999 0.996 0.983 0.945 0.833 0.573 0.000 0.565 
1.000 1.000 1.000 0.998 0.992 0.974 0.923 0.806 0.565 0.000 

To reference dbfs from markdown, use /files instead of /FileStore

Trend Calculus on oil prices

trendcalculusoil

Performance of Trend Calculus Markov Chain model

trendcalculusmcmodelperformance.png

ls dbfs:/FileStore/shared_uploads/
path name size
dbfs:/FileStore/shared_uploads/adlindhe@kth.se/ adlindhe@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/ahlsen@math.su.se/ ahlsen@math.su.se/ 0.0
dbfs:/FileStore/shared_uploads/alek@kth.se/ alek@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/amanda.olmin@liu.se/ amanda.olmin@liu.se/ 0.0
dbfs:/FileStore/shared_uploads/amirhossein.ahmadian@liu.se/ amirhossein.ahmadian@liu.se/ 0.0
dbfs:/FileStore/shared_uploads/axel.berg@arm.com/ axel.berg@arm.com/ 0.0
dbfs:/FileStore/shared_uploads/caylak@kth.se/ caylak@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/ciwan@kth.se/ ciwan@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/dali@cs.umu.se/ dali@cs.umu.se/ 0.0
dbfs:/FileStore/shared_uploads/emilio.jorge@chalmers.se/ emilio.jorge@chalmers.se/ 0.0
dbfs:/FileStore/shared_uploads/fabiansi@kth.se/ fabiansi@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/hugower@kth.se/ hugower@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/johan.oxenstierna@cs.lth.se/ johan.oxenstierna@cs.lth.se/ 0.0
dbfs:/FileStore/shared_uploads/johannes.graner@gmail.com/ johannes.graner@gmail.com/ 0.0
dbfs:/FileStore/shared_uploads/ka_barber@live.co.uk/ ka_barber@live.co.uk/ 0.0
dbfs:/FileStore/shared_uploads/karl.bengtsson_bernander@it.uu.se/ karl.bengtsson_bernander@it.uu.se/ 0.0
dbfs:/FileStore/shared_uploads/linn.ostrom@math.lth.se/ linn.ostrom@math.lth.se/ 0.0
dbfs:/FileStore/shared_uploads/martin.andersson@math.uu.se/ martin.andersson@math.uu.se/ 0.0
dbfs:/FileStore/shared_uploads/niklas.gunnarsson@it.uu.se/ niklas.gunnarsson@it.uu.se/ 0.0
dbfs:/FileStore/shared_uploads/pavlo.melnyk@liu.se/ pavlo.melnyk@liu.se/ 0.0
dbfs:/FileStore/shared_uploads/petterre@kth.se/ petterre@kth.se/ 0.0
dbfs:/FileStore/shared_uploads/robert.gieselmann@gmail.com/ robert.gieselmann@gmail.com/ 0.0
dbfs:/FileStore/shared_uploads/scenegraph_motifs/ scenegraph_motifs/ 0.0
dbfs:/FileStore/shared_uploads/vpol@kth.se/ vpol@kth.se/ 0.0
displayHTML("<img src='/files/shared_uploads/johannes.graner@gmail.com/trend_calculus_oil.png'>")
displayHTML("<img src='/files/shared_uploads/johannes.graner@gmail.com/trend_calculus_mc_model_performance.png'>")