Transparency of Values in a Decentralised World

The Seratio Ecosystem is aided by AI to assess the complex analytical data available and to steer consumers, becoming their online friend and guide

Hyper-Reality by Keiichi Matsuda

Two important trends are making corporates re-think their strategies and relevance in our future economies.

Transparency

The hallmark of this decade has been the growth of transparency as an instrument to openly display governance in our public offices, demonstrate corporate intentions, reflect the goodness within our products and even prove our own values to each other.

The public sharing of our actions and decisions is rapidly accelerating, particularly among the under-35s. Social media and mobile apps bring our lives under closer scrutiny, for many now on a daily basis. The things we like, where we socialise and what we eat are all commonly shared on social media. But what about the transparency of information to consumers? Whilst price comparison sites and convenient middlemen layers have become commonplace, we remain sadly lacking in coherent, simple signposting for most of the goods and services we buy. To navigate our often intricate lives we readily embrace any kind of ‘Go-NoGo’ filter that allows decisions to be guided with the least effort.

Arguably, consumer data (the value sets and preferences of individuals) may become the most valued asset. As the trend for transparency continues, the analytics behind those choices and preferences will shape the retail market push. Just as important, the flow of information to consumers will shape the markets: just as comparison sites have influenced and taken ownership of consumers in the insurance space, providers of information and data on the provenance of goods have the same opportunity to guide, advise and own the relationship with those consumers. Have goods been manufactured with slavery within their supply chain? Are the suppliers and supply chains manufacturing those goods adding value to the world, or detracting from total value by stealing from their supply chain or from the environment?

The world is an increasingly complex place and consumers crave simplicity and convenience. For this they are willing to sacrifice security, privacy and price. At the same time, the expectation of free services has never been greater. The traditional model of “Make it Pay” has been turned upside down by the media giants of the last two decades as they offer an app for our lives for free.

So it is somewhat surprising that the consumer “values” data space remains largely untapped. What is not just in our wallet but rather in our heads, our thoughts and our hearts remains opaque to retailers and suppliers as they seek greater levels of optimisation.

Decentralisation

The antidote to centralisation is perhaps an even greater challenge for a traditional corporate organisation to navigate: how to remain relevant as a central player and controller in the non-centralised world the new public seems to crave. The traditional model of the central controller breaks down in the decentralised world of the blockchain. This gives rise to new models, new expectations and new philosophies.

The inside-out model fails completely when one tries to implement a blockchain solution with a focus on one key stakeholder. Distributed Ledger Technology (DLT) was created for a peer-to-peer environment, deliberately eliminating the need for a central controller. Whilst blockchain is hailed as the next revolution, with an even bigger impact than the internet, the largely centralised corporate world is still struggling to create meaningful, viable, value-added applications, as these models strike at the very heart of its operations.

Solution

The Seratio Ecosystem has brought together a model which addresses these two key challenges.

The original starting point for the ecosystem was born from academic research into the movement of non-financial value and the understanding of a total value model, taking into account social, environmental and governance measures to assess the impact of a person, project, organisation, city, region or country. As with any measurement system, once metrics were established, agreed, recognised and used, then focus and improvements may follow. The DLT of blockchain technology presented an ideal opportunity to move from the measurement of non-financial value to the transaction of non-financial value.

Communities of people who share common values and a trading economy have already formed their own currencies, outside the traditional boundaries of government and regulation and without any central controllers: organic growth. Value sets are recognised, shared and transacted. Community members prefer to seek out those with aligned values, those who share the currency and demonstrate their value sets. This has led to crypto-currencies imbued with values, carrying the attributes of the communities that created them, making them more transparent both to those who prefer to operate within these large, cohesive, vibrant and transactable communities and to those who wish to supply and target them with loyalty and rewards programmes.

Recognising the value sets, or ‘provenance’, of individuals, the Seratio Ecosystem also builds in the provenance of goods and services. The complex system is filtered down to a simple score to direct and guide buyers: in essence, to enable them to buy the things they value, which may favour a particular value set (for example women or students), whilst being steered away from goods and services that fall outside their individual values (for example those linked to slavery or environmental harm).

The Seratio Ecosystem is aided by AI to assess the complex analytical data available and to steer consumers, becoming their online friend and guide. By directing purchases online and on the high street, value-aligned purchases can be delivered easily, with no effort on the part of the consumer.

The use of blockchain and cryptocurrency records and gives transparency to these value choices. In the same way that social media has a positive ground-up impact and empowers people, the blockchain has a similar effect, demonstrating aligned values and, for the first time, linking them to the economic activity those values support.

Number by Robert Hloz

 

Next Steps

To scale the models, we need to build systems and put the ecosystem platforms into practice within a mass market. We welcome new partners who are interested in transforming their impacts and becoming key players in this brave new world.

Who will step up and become the Google of the Blockchain World?

SERATIO AI BOT

Introduction

Core values are the fundamental beliefs of a person or organisation. They act as guiding principles that help people distinguish right from wrong. While walking down a commercial street, have you ever wished for an app that would notify you of discounts from a preferred retailer, or on a product, that align with your values? This is one of the featured functions of the new Seratio AI Bot. On the other hand, it can also help the retailer identify and incentivise the customers who align with their values.

AI Bot

The main objective of the AI Bot is to help its users make informed decisions based on their values. By building predictive models and using AI algorithms, the AI Bot will be able to learn users’ preferences and behaviour. This will help the bot identify their next course of action and provide suggestions.

The AI Bot receives the required data from multiple core segments of the Seratio Ecosystem. They are as follows:

1. SAPI

The Seratio API (SAPI) is the software that implements impact assessment and tracking (non-financial value analysis). It is responsible for S/E score calculation and S/E certificate awards, and has a built-in S/E translator enabling engagement with and use of all other non-financial metrics.

From a corporate point of view it can process and combine different data sets, including product provenance monitoring, modern slavery conditions checks and Proof-of-[…] metrics, and convert them all into a single-number non-financial attribute: the S/E score.
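As a rough illustration of that aggregation step, the sketch below collapses a few example inputs into one number. The field names, weights and scaling are assumptions made for this sketch and do not reflect the actual S/E methodology.

```python
# Hypothetical sketch only: field names and weights are illustrative assumptions,
# not the actual S/E (Social Earnings Ratio) methodology implemented by SAPI.

def se_score(provenance_score, slavery_compliance, proof_metrics):
    """Collapse heterogeneous non-financial inputs (each scaled to 0..1)
    into a single-number S/E-style attribute."""
    weights = {"provenance": 0.4, "slavery": 0.4, "proof": 0.2}  # assumed weights
    combined = (
        weights["provenance"] * provenance_score
        + weights["slavery"] * slavery_compliance
        + weights["proof"] * proof_metrics
    )
    return round(combined, 3)

# Example: strong provenance data, verified anti-slavery checks, partial Proof-of-[...] coverage.
print(se_score(provenance_score=0.9, slavery_compliance=1.0, proof_metrics=0.5))  # 0.86
```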

2. Seratio Platform Wallet

Anonymity is a fundamental idea behind blockchain. Although it provides high security, it can also undermine trust. CCEG’s solution to this problem is to maintain anonymity while bringing in a non-financial, intangible attribution (social value) for each transaction. The fundamental functionality of the platform is to add an extra layer of value to every transaction.

Here every user has the option to enter their basic, financial and social information, which helps in generating the S/E (Social Earnings Ratio) certificate through SAPI.

The details which are used to calculate the SE Ratio of an individual include:
• Country
• Number of people dependent on the individual
• Value of asset
• Environmental considerate decisions
• Amount spent to help others
• Amount raised, or helped to raise, for others
• Number of people positively influenced by the individual

In the wallet, users can set SER preferences to allow or restrict transactions and interactions with organisations, products, projects and people. This means a counterparty can transact with the individual only if they satisfy the SE Ratio cut-off threshold set by that individual.
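A minimal sketch of how such a cut-off could be enforced at transaction time is shown below; the class and field names are hypothetical and simply illustrate the gating logic described above.

```python
# Hypothetical sketch of the SER cut-off check described above;
# names and structure are illustrative, not the actual wallet implementation.

class WalletPreferences:
    def __init__(self, min_counterparty_ser: float):
        # Minimum S/E Ratio a counterparty must hold for a transaction to proceed.
        self.min_counterparty_ser = min_counterparty_ser

def can_transact(prefs: WalletPreferences, counterparty_ser: float) -> bool:
    """Allow a transaction only if the counterparty meets the user's SER threshold."""
    return counterparty_ser >= prefs.min_counterparty_ser

prefs = WalletPreferences(min_counterparty_ser=0.7)
print(can_transact(prefs, counterparty_ser=0.85))  # True  -> transaction may proceed
print(can_transact(prefs, counterparty_ser=0.40))  # False -> transaction is restricted
```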


3. Provenance

Provenance Monitoring is an S/E-based analytical tool which allows the provenance of products, companies, services etc. to be tracked. In terms of product provenance, the platform will track both financial and non-financial attributes at every stage of the supply chain. This information is then submitted to the blockchain through a smart contract, where it integrates with the Seratio blockchain platform. When the product moves from one part of the supply chain to another, the information is carried forward and verified against the corresponding information on the blockchain. This creates a secure, shared record of exchange for each product along with specific product information.

When the product is complete, the collective information is processed by SAPI, which provides its aggregate social value along with its other tangible information.
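To illustrate the carry-forward-and-verify idea, the toy sketch below chains each supply-chain stage record to the previous one by hash. The fields and hashing scheme are assumptions made for illustration, not the Seratio smart contract itself.

```python
# Toy illustration of chained provenance records; fields and hashing scheme are
# assumptions made for this sketch, not the actual Seratio smart contract.
import hashlib, json

def add_stage(chain, stage, data):
    """Append a stage record that links back to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    record = {"stage": stage, "data": data, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)
    return chain

def verify(chain):
    """Check each record still hashes correctly and links to its predecessor."""
    prev_hash = "genesis"
    for record in chain:
        body = {k: v for k, v in record.items() if k != "hash"}
        if record["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True

chain = []
add_stage(chain, "raw_material", {"supplier_ser": 0.8, "slavery_check": "passed"})
add_stage(chain, "manufacturing", {"supplier_ser": 0.6, "slavery_check": "passed"})
print(verify(chain))  # True: the record of exchange is intact
```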

4. Rewarding Body

Rewarding body functionality supports organisations in creating social value: users earn tokens through the impact they make via their personal beliefs or social activity. Where paid staff are rewarded with normal currencies, social currencies can be distributed to an organisation’s unpaid supporters and helpers. Whether it is a reward for a referral made by a valued customer or for volunteering for a social cause, rewards can be given as thanks. Users are rewarded with specified tokens for their social activities based on the rewarding policy set by the rewarding body, which then transfers the tokens to the user directly using the user’s QR code.

5. Microshare Exchange and Retail

The MICROSHARE is a unit of non-financial, intangible value gained by people through activities reflecting their personal beliefs or social activity (volunteering, charity, caring, etc.). The first batch of microshares are the SER microshares, which were distributed to all who contributed to the ICO. The other altcoins will receive their microshares as well. At that point, microshares will be tradable for other microshares at the microshare exchange.

Soon users will be able to acquire microshares from a rewarding body and to buy discount vouchers from retailers. These retailers will empower the community and the values they believe in through discount policies. A discount policy will contain:

• Minimum SER value needed to avail discount
• Type of microshare
• Amount of microshare
• Discount offered

Users would be able to purchase the discount in advance and present it at the retail shop, or pay at the counter.
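As a sketch of how such a policy could be represented and checked, the hypothetical structure below mirrors the four fields listed above; the names and values are illustrative assumptions.

```python
# Hypothetical representation of a retailer discount policy (fields mirror the list above).
from dataclasses import dataclass

@dataclass
class DiscountPolicy:
    min_ser: float           # Minimum SER value needed to avail the discount
    microshare_type: str     # Type of microshare accepted (e.g. "SER")
    microshare_amount: int   # Amount of microshares required
    discount_percent: float  # Discount offered

def eligible(policy: DiscountPolicy, user_ser: float, holdings: dict) -> bool:
    """Check whether a user qualifies for the retailer's discount."""
    return (user_ser >= policy.min_ser
            and holdings.get(policy.microshare_type, 0) >= policy.microshare_amount)

policy = DiscountPolicy(min_ser=0.5, microshare_type="SER", microshare_amount=10, discount_percent=15.0)
print(eligible(policy, user_ser=0.72, holdings={"SER": 25}))  # True
```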

Key outcomes for regular users:


The AI Bot can give suggestions or predictions based on an individual’s SE Ratio preferences.

Depending on the user’s values and behavioural patterns, some of the model prediction scenarios are as follows:

• If an individual wants to buy a product, the AI Bot can suggest a list of companies providing that product, based on the SE score preferred by the individual.

• If an individual wants to improve their SER value, the AI Bot can suggest a list of NGOs or rewarding policies based on their values.

• The AI Bot can suggest available offers and discounts. These offers can be redeemed using Microshares.

• It can also suggest ways to earn Microshares by integrating with the rewarding body features in the Seratio platform, taking into account the proximity and preferences of the customer.

• The AI Bot can derive purchase patterns from an individual’s previous purchase history and use them to suggest likely future buying behaviour and other related activities.

 

Key outcomes for Organisation/Company:

Similarly, the AI Bot can give suggestions or predictions based on an organisation’s preferences and values. Some of the key suggestions can be seen below.


• If a company wants to launch a product and sell it to a targeted volume of customers, our AI Bot can suggest the SER value the company should hold to reach that volume.

• Also, if a company wants to improve its SER value, our AI Bot can suggest the areas where the company’s activities can be rectified or corrected to move from a low SER value to a preferable one.

For example, our Bot can suggest improving wages if one of the reasons for the company’s currently low SER value falls in the Modern Slavery category.

• For a manufacturing company looking to improve its SER value, our AI Bot can suggest a list of higher-SER suppliers with whom the company should associate, and a list of suppliers with whom it should not associate at any stage of product manufacturing.

• To help improve a company’s CSR activities, the Bot would be able to provide a list of NGOs or rewarding bodies and give insights into their current and historic social activity. The company could then either collaborate with these organisations or conduct its own activities.

 

Building Predictive Models – Our Approach

Predictive models predict the future behaviour of a user. Predictive modelling uses a group of statistical algorithms which, when applied to the provided data, output a mathematical function, equation or logical program that helps to predict outcomes. To illustrate our approach, a sample of the steps taken is provided below.

Stages of building a Predictive Model:


A. Data Cleaning:

This stage covers detecting, correcting or removing inaccurate records from the given data source, which is important for maintaining the quality of the data.
It includes the following (a brief pandas sketch follows this list):
• Various data-importing scenarios: different kinds of datasets (.csv, .txt), different delimiters (comma, tab, pipe) and different methods (read_csv, read_table)
• Getting the basic information, such as dimensions, column names and a statistics summary
• Basic cleaning: removing NAs and blank spaces, imputing values for missing data points, changing a variable’s type, etc.
• Creating dummy variables in various scenarios to help modelling
• Generating simple plots like scatter plots, bar charts, histograms, box plots etc.
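A minimal pandas sketch of these cleaning steps; the file name and column names are illustrative assumptions.

```python
# Minimal pandas sketch of the cleaning steps above; the file name and
# column names are illustrative assumptions.
import pandas as pd

# Importing a dataset (read_csv handles other delimiters via sep=, e.g. "\t" or "|";
# read_table is the tab-delimited variant)
df = pd.read_csv("users.csv")

# Basic information: dimensions, column names and a statistics summary
print(df.shape, list(df.columns))
print(df.describe())

# Basic cleaning: drop empty rows, impute missing values, change a variable's type
df = df.dropna(how="all")
df["amount_spent_helping"] = df["amount_spent_helping"].fillna(0)
df["dependents"] = df["dependents"].astype(int)

# Dummy variables for a categorical column, to help modelling
df = pd.get_dummies(df, columns=["country"])

# A simple exploratory plot
df["amount_spent_helping"].plot(kind="hist")
```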

B. Data Wrangling:

This is the process of transforming and mapping data from ‘raw data’ form into a desired format using merging, grouping, concatenating etc. for better decision making and analytics.
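A short pandas illustration of these wrangling operations, using assumed tables and column names:

```python
# Illustrative merge/group/concatenate on assumed tables; the column names are hypothetical.
import pandas as pd

users = pd.DataFrame({"user_id": [1, 2], "country": ["UK", "IN"]})
activity = pd.DataFrame({"user_id": [1, 1, 2], "donation": [10.0, 5.0, 20.0]})

# Merge user records with their activity, then group and aggregate per user
merged = users.merge(activity, on="user_id", how="left")
per_user = merged.groupby(["user_id", "country"], as_index=False)["donation"].sum()

# Concatenate two batches of the same shape into a single frame
combined = pd.concat([per_user.iloc[:1], per_user.iloc[1:]], ignore_index=True)
print(combined)
```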

C. Explore the data with Python

In this step the data, which has been saved in the database from the given data source, is loaded into Python for further processing. At this stage the data is converted to a dataframe, a data structure provided by the Python package pandas.

An illustrative representation of such a dataframe, with the details of a user for the calculation of the SE Score, is sketched below.


After this, we will access the required columns from the dataframe by removing the unwanted ones.
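A hypothetical sketch of such a dataframe, and of selecting only the required columns, might look like this (the column names are assumptions based on the user details listed earlier):

```python
# Hypothetical sketch of a user-details dataframe for the SE Score calculation,
# followed by selecting only the required columns; all column names are assumptions.
import pandas as pd

df = pd.DataFrame([{
    "user_id": 1,
    "country": "UK",
    "dependents": 2,                 # number of people dependent on the individual
    "asset_value": 120000,           # value of assets
    "environment_score": 0.7,        # environmentally considerate decisions
    "amount_spent_helping": 350.0,   # amount spent to help others
    "amount_raised": 1200.0,         # amount raised, or helped to raise, for others
    "people_influenced": 40,         # people positively influenced by the individual
    "signup_channel": "web",         # an example of an unwanted column
}])

# Access the required columns, removing the unwanted ones
required = ["country", "dependents", "asset_value", "environment_score",
            "amount_spent_helping", "amount_raised", "people_influenced"]
df = df[required]
print(df)
```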

D. Creating and training a model

This is the step where the model is created and trained. For prediction, we first have to find a function/model that best describes the dependency between the variables in our dataset (their correlation).

We use the Linear Regression algorithm to create the model if the output to be predicted is a continuous variable, whereas we use the Logistic Regression algorithm if the output variable is binary or categorical.

After this, the dataset is split into training and testing datasets for the purpose of Training the Model.

• The training dataset is the one on which the model is built. This is the one on which the calculations are performed and the model equations and parameters are created.

• The testing dataset is used to check the accuracy of the model. The model equations and parameters are used to calculate the output based on the inputs from the testing datasets. These outputs are used to compare the model efficiency.

Training the model means fitting the created model to the training set of data. At this stage, we have created a model which is trained with the training dataset and is ready to handle the test dataset.
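The sketch below shows this split-and-fit step with scikit-learn on synthetic data; the feature meanings are assumptions, and LogisticRegression would replace LinearRegression for a categorical target.

```python
# Generic sketch of the split-and-fit step with scikit-learn; the data here is synthetic
# and the feature meanings are assumptions (e.g. spend, donations, people influenced).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression  # use LogisticRegression for a categorical target

rng = np.random.default_rng(0)
X = rng.random((200, 3))                                      # three illustrative input variables
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 200)   # a continuous output variable

# Split the dataset into training and testing datasets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Training the model: fit the created model to the training set
model = LinearRegression()
model.fit(X_train, y_train)
```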

E. Prediction

This is the stage where the prediction process happens. The trained predictive model is ready to give results/predictions for the test dataset we created in the previous step. The model’s predict() method is used to produce the expected results.

In this stage the error between our test predictions and the actual values is calculated to check the efficiency of the model we have created.
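Continuing the sketch above, prediction and a simple error check could look like this:

```python
# Continuing the sketch above: predict on the test dataset and measure the error
# between our test predictions and the actual values.
from sklearn.metrics import mean_squared_error

predictions = model.predict(X_test)
error = mean_squared_error(y_test, predictions)
print("Mean squared error on the test set:", error)
```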


Explaining the approach using an example:

We consider a scenario where we want to predict the number of sales a retailer will have on a future date, depending upon factors such as offers/discounts on a particular category of products provided to customers who buy using Microshares.

A. Data Cleaning

The provided data is prepared by removing/updating the inaccurate entries to maintain the quality of data.

B. Data Wrangling

In this stage, we merge/group/concatenate the data as needed when analysing our data source, and store it for later use (a short pymongo sketch follows this list):
•  Create a Database in MongoDB
•  Create a collection named sales_data
•  Save the sample data to sales_data
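A minimal pymongo sketch of these steps; the connection string, database name and record fields are assumptions made for illustration.

```python
# Minimal pymongo sketch of the steps above; the connection string, database name
# and record fields are assumptions made for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
db = client["seratio_retail"]          # create (or reuse) a database
sales_data = db["sales_data"]          # create (or reuse) the sales_data collection

# Save sample data to sales_data
sales_data.insert_many([
    {"date": "2018-03-01", "category": "Clothing", "discount_percent": 10, "paid_with_microshares": True,  "SalesCount": 120},
    {"date": "2018-03-02", "category": "Clothing", "discount_percent": 0,  "paid_with_microshares": False, "SalesCount": 60},
    {"date": "2018-03-03", "category": "Sports",   "discount_percent": 15, "paid_with_microshares": True,  "SalesCount": 95},
    {"date": "2018-03-04", "category": "Sports",   "discount_percent": 0,  "paid_with_microshares": False, "SalesCount": 40},
])
```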

C. Explore the data with Python

•  Load data from database to Python
•  Import data source and convert to pandas dataframe
• Get the required columns from dataframe by avoiding the unwanted ones.
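A sketch of loading the stored records into a pandas dataframe (database, collection and field names follow the earlier sketch):

```python
# Load the stored records from MongoDB into a pandas dataframe
# (database, collection and field names follow the sketch above).
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
records = list(client["seratio_retail"]["sales_data"].find({}, {"_id": 0}))
df = pd.DataFrame(records)

# Keep only the columns needed for the sales prediction, avoiding the unwanted ones
df = df[["category", "discount_percent", "paid_with_microshares", "SalesCount"]]
print(df.head())
```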

D. Creating and training a model

We use the Linear Regression algorithm to create the model, as described in the scenario. To create the model, we follow the steps below (a code sketch follows the list):

•  Store the variable we will be predicting, SalesCount
•  Generate the training set, train
•  Select anything not in the training set and put it in the testing set, test
•  Print the shapes of both sets
•  Initialise the model class, linear_model
•  Fit the model to the training data
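A sketch of those steps in code, continuing from the dataframe df above (the names train, test and linear_model follow the text; the column names are the assumed ones from the earlier sketches):

```python
# Sketch of the training steps above, continuing from the dataframe df;
# column names are the assumed ones from the earlier sketches.
import pandas as pd
from sklearn.linear_model import LinearRegression

# SalesCount is the variable we will be predicting; everything else becomes a feature
features = pd.get_dummies(df.drop(columns=["SalesCount"]))

# Generate the training set, train, and put everything not in it into the testing set, test
train = df.sample(frac=0.8, random_state=1)
test = df.drop(train.index)
print(train.shape, test.shape)

# Initialise the model class and fit it to the training data
linear_model = LinearRegression()
linear_model.fit(features.loc[train.index], train["SalesCount"])
```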

 Now we have trained a Linear Regression Model named linear_model.

E.  Prediction

At this stage, we can use the trained linear_model to predict the required output/results using the test set test.

•  Generate our predictions, linear_predictions for the test set
•  Compute error between our test predictions, linear_predictions and the actual values, test
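And the prediction step, using linear_model on the held-out test set:

```python
# Generate predictions, linear_predictions, for the test set and compute the error
# against the actual values held in test.
from sklearn.metrics import mean_squared_error

linear_predictions = linear_model.predict(features.loc[test.index])
error = mean_squared_error(test["SalesCount"], linear_predictions)
print("Predicted sales counts:", linear_predictions)
print("Mean squared error on the test set:", error)
```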

Now, at this stage, we have created a model named linear_model in Python which can predict the sales count of a retailer on a future day by analysing various factors: for example, whether there is an offer/discount on a particular ‘Category’ (Clothing, Sports Accessories, etc.) on that day, or whether the customer will be using ‘Microshares’ to buy, and so on.

 

Conclusion:

This predictive model can be used by our AI Bot to help retailers predict future sales counts based upon the various factors mentioned. Likewise, by building different prediction models into our AI Bot, the main objectives, such as helping users make informed decisions based on their values, can be fulfilled.