Tuesday, September 26, 2017

IoT Touches An Entire Suite of ERP

On a recent trip to Mexico City, Oracle rolled out three IoT SaaS apps. IoT is usually synonymous with IaaS/PaaS, so it is refreshing that Oracle took a different approach and focused on SaaS instead of "thingy" IaaS/PaaS.

What makes Oracle's approach interesting is what is done with the IoT data. It can be piped to an entire array of ERP apps. Supply Chain Management. Customer Experience Cloud Suite. Human Capital Management. Enterprise Resource Planning. Enterprise Performance Management.

Think of all the business transformation that can happen with real-time data in all of these silos – customer sentiment demand forecasting, employee motivation management, feedback-driven development, business asset performance analysis.

https://blogs.oracle.com/iot/oracle-iot-cloud-apps-rollout-in-mexico-city

Wednesday, May 10, 2017

Digital Transformation (DX) Enables Customer Experience (CX) and Business Transformation (BX)

There is much talk about the focus on CX - Customer Experience - as the only differentiation left in this hypercompetitive world of products and services. And to support that experience, consultants will say it requires BX - Business Transformation. BX will require the right process and technology. But what will power BX? Better data? Better processes? This is provided by DX - Digital Transformation.

Take the case of a mobile phone call. My friends at Verizon are being laid off as T-Mobile, AT&T and Sprint gain market share. How can DX help Verizon to gain in CX and BX?





 
In order to improve CX, a modern BX is needed. And to have a modern BX, DX needs to enable it.



Total customer experience (CX) is key to winning market share back. Ben is making a call to Sally. His call reception is poor and the call eventually drops. He is unhappy. Ben spends about $100/month on his mobile phone, and he is constantly bombarded by offers to switch. With the right DX, the call is tracked from its source all the way to the end (device information, call information). The call path is tracked (tower information). And the customer is tracked (customer information).

CX : The complete customer experience. Product. Service. Support

BX : The business transformation : real time information, insight analytics, global collaboration, cross silo sharing, machine learning/prediction/recommendation

DX : The technology that enables BX to grow CX. Cloud, single source of truth, multi-mode communication, API access to cloud services, fast database access

With DX, Ben's dropped call is known to customer service. When Ben calls, the CRM system already predicts why Ben is calling and shows suggested ways to appease him - tell him detailed information about the dropped call and what will be done about it. With BX, an account representative can make a follow-up "pre-emptive" call and offer Ben a credit to stay. And while the customer service agent is talking to Ben, a technician (who happens to be up on the hill with the troubled tower) can service it and save a four-hour round-trip drive up the hill.
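To make the data flow concrete, here is a minimal sketch of how a dropped-call event, tower status, and a customer record could be joined to drive those actions once DX puts them in one place. Every name, field, and threshold below is hypothetical, purely for illustration:

```python
# Hypothetical sketch: joining a dropped-call event, tower status, and a
# customer record to drive proactive actions. All names, fields, and
# thresholds are made up for illustration.

dropped_call = {"customer_id": 42, "tower_id": "T-17", "duration_sec": 95}
tower_status = {"T-17": {"healthy": False, "technician_nearby": True}}
customers = {42: {"name": "Ben", "monthly_spend": 100, "churn_risk": 0.8}}

def retention_actions(event):
    actions = []
    cust = customers[event["customer_id"]]
    tower = tower_status[event["tower_id"]]

    # CX: customer service sees the dropped call before Ben even dials support.
    actions.append(f"CRM note: {cust['name']} had a dropped call on tower {event['tower_id']}")

    # BX: high-spend, high-churn-risk customers get a pre-emptive credit offer.
    if cust["monthly_spend"] >= 100 and cust["churn_risk"] > 0.5:
        actions.append("Account rep: make a pre-emptive call and offer a retention credit")

    # Field ops: dispatch the technician who is already near the troubled tower.
    if not tower["healthy"] and tower["technician_nearby"]:
        actions.append("Dispatch nearby technician to service the tower")
    return actions

print(retention_actions(dropped_call))
```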

How is this fluid interaction of data and decisions possible? DX.

Tuesday, April 4, 2017

A Single Supply Chain Data Platform For All

The supply chain value chain is a huge network of raw data, storage systems, analysis frameworks, and end users. Traditional supply chain systems are siloed by function - ERP, S&OP, APS, TMS, WMS, MES, etc. Each has its own little world of data storage, retrieval, and analysis.

Great for the yesteryear of "waterfall" data consumption.  But times have changed.

For example, let's focus on a brand (like Apple). Apple does not own a single factory. It relies on a global network of suppliers (TSMC, and Samsung - a frenemy) to source its global network of contract manufacturers (Foxconn). With a global network of suppliers and demanders, a company is now exposed to risks around the world, 24/7. On the supply side, a fire, a strike, geo-political instability, or war can disrupt a JIT flow of parts. On the demand side, a negative post by an influential blogger can render a demand model obsolete in seconds. A global network requires real-time access to data across the planet. Traditional systems were great at performing a singular task - optimize factory usage, minimize inventory carrying cost, reduce shipping time - but have become rigid in the modern world of social media and globalization. Big Data can help. A lot.


Adopting A Big Data Platform For Supply Chain




How do we transform a traditional supply chain system into Big Data? Starting with raw data, the platform needs to support a variety of sources (IoT devices, social media, suppliers, transportation, inventory, factories), many formats (flat files, CSV, JSON), and many delivery mechanisms (e-mail, FTP, RESTful API, GPRS). It also needs to reduce the need to model the data up front.

Traditional relational database design required careful profiling, planning, and design of data models: typical steps included creating an ERD using Crow's Foot notation, carefully following normal forms to optimize the database, and profiling the data to find the optimal datatype declarations. An EDW had to know its queries in advance to avoid expensive JOINs. Big Data - with its support for unstructured data and flexible mapping techniques - helps greatly with all of this.

Data ingestion needs to be automated - for example, using cloud CI/CD automation tools such as Jenkins to load data into AWS S3. Finally, data analysis needs to support decision makers with descriptive (what happened in the past), predictive (what might happen in the future), and prescriptive (what should I do) views.
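As one simplified sketch of what automated ingestion into such a landing zone could look like, the snippet below pushes a small JSON supplier feed into S3 with boto3. The bucket, key, and feed contents are made up; in practice a Jenkins job (or similar) would run a step like this on a schedule:

```python
import json
import boto3

# Hypothetical supplier feed; in practice this could arrive as CSV, JSON, or
# flat files via e-mail, FTP, a RESTful API, or a device gateway.
supplier_feed = [
    {"supplier": "TSMC", "part": "SoC", "region": "Taiwan", "lead_time_days": 21},
    {"supplier": "Samsung", "part": "OLED panel", "region": "South Korea", "lead_time_days": 14},
]

s3 = boto3.client("s3")

# Land the raw, unmodeled data in an S3 bucket; schema-on-read tools can
# profile and analyze it later without an up-front ERD.
s3.put_object(
    Bucket="supply-chain-raw-landing",             # placeholder bucket name
    Key="suppliers/2017-04-04/feed.json",          # placeholder key
    Body=json.dumps(supplier_feed).encode("utf-8"),
)
```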


Monday, March 13, 2017

API Economy - An Introduction









What Is An API?

API stands for Application Program Interface. APIs allow programmers to write, for example, a mobile app that looks up your monthly gas bill stored in a cloud database. In other words, the cloud database provides its information through the API.





How Does API Work?

So how does the gas bill app access the data in the database? Through the database API. The API is a clear set of software functions (provided by the database) that allow another program (the gas bill app) to retrieve customer information. A simple example of an API function could be get_bill().
 
Let's assume that PG&E (a utility company) has 4.3M gas customers. The information on the 4.3M customers is stored in a database for 1. Billing (I need to send a monthly bill to customer Nat Gus) and 2. Service (customer Nat Gus has a 2,000 sq ft house, a gas water heater, and a gas stove range).
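Here is a minimal sketch of what a get_bill() style API could look like on the provider side, using Flask. The route, field names, and data are invented for illustration; PG&E's real systems will of course look nothing like this:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the utility's billing database; names and numbers are made up.
BILLS = {
    1001: {"customer": "Nat Gus", "month": "2017-02", "amount_usd": 84.12},
}

@app.route("/customers/<int:customer_id>/bill")
def get_bill(customer_id):
    """Return the latest bill for a customer, or 404 if the customer is unknown."""
    bill = BILLS.get(customer_id)
    if bill is None:
        return jsonify({"error": "customer not found"}), 404
    return jsonify(bill)

if __name__ == "__main__":
    app.run(port=5000)
```

The gas bill app then simply calls GET /customers/1001/bill over the network instead of touching the billing database directly.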

So far, this example is one where the API user (the gas bill app) and the API provider (the customer database) are in the same company, PG&E. But the API (to the customer database) can also be MONETIZED. That is, you can sell the right to use the API.


Another Example - Making Money from API

At the business level, APIs allow you to make money in ways that were not possible before. They allow companies with 1. data (customer data, weather data) or 2. a service (send us data, we return insight) to SELL it. Another way of making money from an API is to allow access to your PLATFORM (Amazon, Alibaba, eBay) and charge a transaction fee. Let's look at the ways:



Data To Sell Via API And Make Money From Your Data

The WeatherChannel has been collecting weather information around the globe. You are an insurance company that wants to calculate the risk of flooding in Oslo. The WeatherChannel will sell the insurance company access to the weather data via an API. Once the payment is made from the insurance company to the WeatherChannel, the WeatherChannel will give the insurance company an API KEY and the API documentation.
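On the consumer side, the insurance company's code might look roughly like this sketch. The endpoint, parameters, and key are all made up (this is not a real WeatherChannel API); the point is simply that the paid-for API KEY is what unlocks the data:

```python
import requests

API_KEY = "insurer-key-123"  # hypothetical key issued after payment
BASE_URL = "https://api.example-weather.com/v1/observations"  # made-up endpoint

# Request historical precipitation around Oslo to feed a flood-risk model.
response = requests.get(
    BASE_URL,
    params={"lat": 59.91, "lon": 10.75, "metric": "precipitation", "years": 30},
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
response.raise_for_status()
observations = response.json()
print(f"Received {len(observations)} observations for the flood-risk model")
```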


Service To Sell Via API And Make Money From Your Software/Know How

You have developed an algorithm to predict who will drive recklessly and cause an accident. You want to "sell" this algorithm. One way to do this is to provide an API to your algorithm. Insurance companies pay you money, you provide an API KEY and API documentation, they send data (age, race, family income, zip code, etc.), and your algorithm figures out whether the driver is likely to cause an accident. A service can be more than an algorithm - it can be a team of analysts in India doing research.

Grow The Platform API And Make Money From Transactions


You have an e-commerce platform. On this platform, sellers can post goods for sale, and buyers can shop and buy. You want to grow this platform - the more sellers you have, the more buyers will come. You provide a FREE API to verified sellers so they can post their goods for sale. When an item sells, you collect a transaction fee.

Conclusion

An API enables businesses to safely share or sell their data and services on the internet.

Thursday, March 9, 2017

Seeing New Insights From Data - Thanks to GPUs


Geo-Data + Social = New Business Intelligence & Insights


Whereas much of enterprise business analysis has been focused on RDBMS and Big Data sales transactions to test hypotheses and create reports, applying geo-data to business transactions is an area of huge opportunity. Several companies and industries have already adopted geo-data and are reaping financial benefits. For example, UPS is using geo-data to optimize truck delivery routes, aiming for as many right turns at traffic intersections as possible. This is anticipated to save $50M per year.




Enterprise Insights Exploration of Geo-Data + Social







If you look at your favorite social media apps, you will find that they want to track your location. These apps take your location—combined with what you are doing, how you feel, who you are with, and why you are there—and derive invaluable, difficult-to-obtain insights about you. For example, if on January 21, 2017, between 2PM-8PM, you were at location 37.79° N, 122.39° W, and you tweeted that you were feeling happy and civic, you were probably part of the Women's March in San Francisco. Hence, a marketing profile can be built up on you for targeted marketing.




Enterprise Insights Exploration Hampered by CPUs




A business analyst, seeing the value of geo-data, wants to perform an ad-hoc query. She has data from the Women's March, with an estimated 4 million marchers nationwide. She can query who was at the start location of the Washington D.C. march (38.88° N, 77.01° W) at the starting time (1:15 PM EST) and Tweeted or Liked positively. This is the profile of an enthusiastic, conscientious person. The analyst can also query who was at the end location of the march (38.89° N, 77.03° W) but at the starting time of the march—perhaps a supporter or reporter. Acting at the speed of thought, the analyst wants to access billions of rows of data, draw a perimeter on the map to localize around the start of the march, focus on the start time, and filter by contextual data. And after that, try again with another set of criteria, so she can constantly refine her hypothesis to reach a conclusion. But currently, each click causes minutes or even hours of calculation before results are seen. This is due to the nature of CPUs – the limited number of cores, memory speed, and the types of instructions they excel at.
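In pandas terms, one iteration of that query might look like the sketch below (the CSV and its columns are hypothetical). On a few thousand rows this is instant; on billions of rows, every one of these filters is exactly where a CPU-bound system starts to crawl:

```python
import pandas as pd

# Hypothetical table of geotagged, sentiment-scored posts.
posts = pd.read_csv("geo_social_posts.csv")  # columns: lat, lon, timestamp, sentiment
posts["timestamp"] = pd.to_datetime(posts["timestamp"])

# Bounding box around the start of the Washington D.C. march (38.88 N, 77.01 W).
near_start = posts["lat"].between(38.87, 38.89) & posts["lon"].between(-77.02, -77.00)

# Around the 1:15 PM EST start time, positive sentiment only.
at_start_time = posts["timestamp"].between(
    pd.Timestamp("2017-01-21 13:00"), pd.Timestamp("2017-01-21 13:30")
)
positive = posts["sentiment"] > 0.5

enthusiastic_marchers = posts[near_start & at_start_time & positive]
print(len(enthusiastic_marchers), "posts matching the 'enthusiastic marcher' profile")
```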





GPUs to the Rescue of CPUs

Querying a database requires processing cores and fast memory. A CPU-based server is limited to roughly 22 processing cores and fairly fast memory, so CPU servers need to be clustered together to serve queries over billions of rows of data. Another type of processor, the GPU, has thousands of cores and very fast memory. The cores in the GPU process data in parallel and pass data to memory extremely fast. GPUs are so powerful that a single GPU server can sometimes replace multiple clusters of CPU servers. GPUs can save money, reduce labor, lower energy consumption, and reduce space compared with CPUs.




G-DB Harnessing the Power of GPU for Map Point Exploration



While the GPU is a great match for looking through billions of records in milliseconds, a database optimized for the GPU is needed. That's where G-DB comes in. G-DB offers two synergistic products – G-DB Query and G-DB Visual. G-DB Query is a GPU-optimized database. It is an in-memory, columnar database highly optimized to harness the power of thousands of GPU cores. Every SQL query you submit is broken down and re-targeted to run in parallel on thousands of GPU cores. That's how we are able to return queries on billions of rows in milliseconds. But the magic doesn't stop there. Synergistically, the GPU is also ultrafast at drawing the output of the query results. This is where G-DB Visual comes in. It renders the results of your queries immediately – so that you can use your eyes to help your brain discover insights immediately.



Conclusion

Transactions, geo-data, and social media combined will enable insights into people not possible before. Processing billions of rows of this type of data will be slow and/or expensive on a CPU-based system, making this valuable data inaccessible. But GPU-based systems, like G-DB, can handle this type and size of data with ease. With G-DB, not only can you gain insights at the speed of thought, you also have ultrafast, high-fidelity visuals to match.



Wednesday, March 8, 2017

A Look At Google Cloud : Enterprises Shifting Traditional RDBMS To Cloud

The latest talk of Big Data has relegated traditional RDBMS to a dark corner. But my experience with databases is that RDBMS still plays a big role in enterprise operations: either SMBs that don't need the power of Big Data (the barrier to entry can be high), or large enterprises that are perfectly happy with the traditional Data Warehousing/ETL/BI flow.

There are a large number of cloud RDBMS offerings: Amazon AWS, Microsoft Azure, and Google Cloud. Google Cloud SQL offers an easy way to quickly set up a MySQL database in the cloud. Without much instruction reading, I was able to intuitively set one up in 5 steps.



1. Create A Google Cloud  Project

All Google Cloud projects start with a project name. This is the master that can be used to control all infrastructure instances. In fact, it has its own command-line tool (called gcloud) that can be used to programmatically spin up, shut down, clone, and scale infrastructure. For now, I will focus on just creating a project called "projectsource".


2. Create A Google Cloud SQL Instance


A SQL Instance is an easy way to start a MySQL server. Nothing is really required of the user. I chose default settings for everything except the password.



3. Create A Google Cloud Bucket, Import SQL Query Into Bucket

The Python script that creates and populates my SQL database table schema was written and debugged on my laptop. This could have been done easily in Google Cloud, but that's a topic for another day. After creating a Google Cloud bucket, I imported my SQL file into the bucket.



4. Import SQL Query From Bucket to Google Cloud SQL Instance

Once the SQL query is in a bucket, it can be easily imported into the Google Cloud SQL instance.
The only modification I had to make to the SQL file is that I must "USE" a database. So in the SQL file (before it was uploaded into the bucket), I inserted "CREATE DATABASE dbSource; USE dbSource;" at the beginning.




5. Connect Google Cloud Shell To The SQL Instance and SQL Query To Your Heart's Desire!

Pressing the "Google Cloud" button at the button will invoke the Google Cloud Shell. This is where you will be prompted for your MySQL password.
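Beyond the Cloud Shell's mysql client, the instance can also be queried programmatically. Here is a minimal sketch using the mysql-connector-python driver; the IP address, credentials, and table name are placeholders, and you would first authorize your client's network on the instance:

```python
import mysql.connector

# Placeholder connection details for the Cloud SQL instance.
conn = mysql.connector.connect(
    host="203.0.113.10",       # the instance's public IP (placeholder)
    user="root",
    password="your-password",
    database="dbSource",
)

cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM suppliers")  # hypothetical table
(row_count,) = cursor.fetchone()
print(f"suppliers table has {row_count} rows")

cursor.close()
conn.close()
```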






Conclusion:

Google Cloud SQL enables a quick setup of a MySQL database in the cloud and an easy way to upload a SQL file, after which you can use the full power of SQL in the cloud.




Tuesday, February 28, 2017

Intro To Google Cloud Compute Via Sudoku Solver

Solving Sudoku With Google Cloud Compute


Introduction 

Sudoku is a popular puzzle game, available in puzzle books, newspapers, and even as a mobile app. I recently downloaded a Sudoku app and spent hours playing it. As an average player, I sometimes become frustrated that I cannot solve a puzzle quickly enough and have a strong desire to see the solution immediately. Using Google Cloud Compute, I was able to click a few buttons, type a few commands, and see an instant solution.




Cloud Infrastructure (IaaS) Market

The cloud infrastructure market has become a money maker for Amazon, whose AWS division racked up $3.53B in Q4 of 2016 alone (1). Gartner estimated the IaaS market at $22B for 2016 (2). Gartner ranks Amazon as #1 in cloud infrastructure, followed by Microsoft and Google (3). Google recognizes that it is #3 and is actively trying to catch up; it hired a VMware veteran to do just that (4).


Google Cloud Solve Sudoku Puzzle

Wishing to solve a Sudoku puzzle, I found an open-source Sudoku solver from Bob Carpenter (5). It is written in Java. To use his solver, I needed a computer to compile and run his Java code. I decided to give Google Cloud a shot.



12 Easy Steps To Using Google Cloud Compute

1. Apply For A Google Cloud Compute Account. For Some, You Will Be Offered A $300 Credit To Try It Out. Google Promises That They Won't Charge Without Your Permission. 



2. Log Into Your Google Cloud Platform Console. 




3. Create A Google Cloud Project Using The Browser. The Project Is Called "projsudoku9x9". It Will Solve A 9x9 Sudoku Matrix.








4. Spool Up A Google Cloud Compute Engine Using The Web GUI.




5. Activate The Google Cloud Shell. Once Activated, You Will Be In The Project Shell. This Is Where You Will Create Instances Of VMs.




6. Then Use The Google Cloud Shell (Still In The Web GUI) To Provision A Virtual Machine Instance In The Project. I Have Named My Instance "instsudoku9x9".




7. Use The Google Cloud Shell (CLI) To List VM Instances And SSH Into "instsudoku9x9".



8. Copy And Paste Carpenter's Apache Sudoku Java Source File (5) Into The VM Instance. I Did This By Using The "vi" Text Editor To Edit A New File "Sodoku.java", Then Pasting The Source Code.






9. Install The Java SDK Using Ubuntu Apt-Get.



10. Compile (javac) And Run (java) The Sudoku Solver. In The Unsolved Sample Puzzle, Row One Already Has 8,3,7,4,6. The Program Accepts A (Row, Column, Value) Notation, And Hence 008, 023, 037, 074, And 086 Are The Inputs. You Can Repeat This For Rows 1-8. Note: I Had A Typo In The File Name. It Was "Sodoku.java", Now Corrected To "Sudoku.java".
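For clarity, here is a small sketch (not part of Carpenter's solver) showing how those (row, column, value) triples map onto a 9x9 grid - "008" means row 0, column 0 holds an 8:

```python
# Sketch: decode (row, column, value) triples like "008" into a 9x9 grid.
# This only mirrors the input notation; the actual solving is done by the Java program.
triples = ["008", "023", "037", "074", "086"]  # the givens in row one of the sample puzzle

grid = [[0] * 9 for _ in range(9)]
for t in triples:
    row, col, value = int(t[0]), int(t[1]), int(t[2])
    grid[row][col] = value

print(grid[0])  # [8, 0, 3, 7, 0, 0, 0, 4, 6]
```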





11. Delete The VM Instance Using Google Cloud Shell. The Shell Is Somewhat User Friendly - I Mistyped A Command Argument And The Shell Recommended The Right Command.





12. Delete Google Cloud Project "projsudoku9x9".





Conclusion

We had a problem to solve: a Sudoku puzzle. To solve the puzzle, we needed compute resources. Google Cloud gives us instant resources via its infrastructure (IaaS). We were able to compile a Sudoku solver, input the puzzle state, and have the solver spit out a solution.


References

(1) http://venturebeat.com/2017/02/02/aws-posts-3-53-billion-in-revenue-in-q4-2016-up-47-from-last-year
(2) http://www.gartner.com/newsroom/id/3188817
(3) https://thenextweb.com/offers/2016/03/11/amazon-web-services-dominates-cloud-services-market/#.tnw_J0QPissD
(4) https://www.forbes.com/sites/alexkonrad/2015/11/30/what-diane-greene-lessons-at-vmware-tells-us-about-google-cloud/#6aa1ed0b120d
(5) https://bob-carpenter.github.io/games/sudoku/java_sudoku.html

Tuesday, February 14, 2017

RDBMS to Big Data Hadoop Via Cloudera




Consumer Electronics Database

A manufacturer of consumer electronics (local - so safe from Donald) was concerned about his supply chain. Why? As with most supply chain systems, there are multiple risk factors. Geo-political risk is usually one of them - tracking risk to the supply chain from other regions (riots, coups d'état, etc.). But now, with the new White House administration, who would have thought that a "far away" geo-political risk factor originates from us - Donald potentially blocking or heavily taxing parts imports. The company needed to track all parts, their suppliers, and the regions of those suppliers.


Graduate To RDBMS From Excel

They had a list of their suppliers stored in an Excel spreadsheet - first on a local disk drive, then later "upgraded" to the cloud (Google & Microsoft 365). But soon the need to write extensive queries outgrew what pivot tables, sorts, filters, and macros could do. The decision was made to migrate the XLS into an RDBMS. The first shot was done using Oracle MySQL Workbench CE running on Windows 10. The data cleaning and ingestion was done via Python (another topic). Once ingested into the RDBMS, standard MySQL queries can be used. Here is an example:


A supplier database on an RDBMS running on W10
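The ingestion step that precedes queries like this was roughly of the following shape. This is a hedged sketch rather than the actual script: the file name, column names, table, and connection string are placeholders, and the real cleaning rules were more involved:

```python
import pandas as pd
from sqlalchemy import create_engine

# Read the supplier spreadsheet (placeholder file and column names).
suppliers = pd.read_excel("suppliers.xlsx")

# Minimal "cleaning": normalize column names and drop rows missing a supplier name.
suppliers.columns = [c.strip().lower().replace(" ", "_") for c in suppliers.columns]
suppliers = suppliers.dropna(subset=["supplier_name"])

# Load into the local MySQL instance managed through MySQL Workbench (placeholder credentials).
engine = create_engine("mysql+pymysql://root:password@localhost/supply_chain")
suppliers.to_sql("suppliers", engine, if_exists="replace", index=False)
```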


Big Data = Volume. Velocity. Variety.

The CEO was happy - he finally graduated from a spreadsheet to an RDBMS. But he also wanted to deploy it as "Big Data". The list of suppliers (and parts) will grow as we start to migrate other product lines into the RDBMS. He also wanted to use all the goodies of Big Data - deployment in the cloud, advanced analytics, different data types (like pictures). Volume. Velocity. Variety.


Cloudera - First Dip Into "Big Data" Apache Hadoop

So using the same trusty W10 machine, I decided to prototype a Big Data environment for the supplier RDBMS. Cloudera offers a VM (in multiple flavors) and a container as a prototyping vehicle. So after installing Oracle VirtualBox (I have better luck with it than with VMware), my own Cloudera instance was running.

Cloudera VM (Guest OS'd on CentOS), Running on W10 VirtualBox



Cloudera Allows Migration In Steps

The neat thing about the Cloudera setup is that an RDBMS is already set up. So you can check out your RDBMS in the Cloudera VM before you migrate to Hadoop.

MySQL In The Cloudera VM

Apache Sqoop : RDBMS -> Hadoop

Once I had confidence that the RDBMS setup was good in the Cloudera VM, I started down the unknown path of converting it to Hadoop. Apache Sqoop supports this endeavor - and Cloudera makes it super easy.

A little CShell Script To Convert RDBMS Into Hadoop


Once the script is launched, the MapReduce job takes over for hours. You can check on the progress using a web browser.

Using A Web Browser To Check On Apache Sqoop


In Big Data Land!

After the process, we can now use Cloudera Hue (its web interface to Apache Hive) to reuse many of the MySQL queries.

The Supplier Database, Originally In RDBMS, Is Now In "Big Data"!


Conclusion:

The path from RDBMS to Hadoop is manageable if taken in baby steps. Cloudera's environment makes that easy to do. For my next task, I will create clusters.


Monday, January 23, 2017

Machine Learning - For Supply Chain ERP

Machine learning has seen a rise in popularity - from popular fiction to autonomous driving to chat bots. Does it have any applicability to boring Enterprise Resource Planning - especially for Supply Chain?

The answer is a boring "yes, of course". Supply Chain is the art and science of making sure factories are humming 24/7 and that logistics is delivering optimally and predictably.

If you examine most of the Supply Chain ERP modules from Oracle/NetSuite, SAP, Infor, etc., you will find that they have basically broken down their solutions into four silos: Suppliers, Logistics, Factory, and Warehousing.

How can machine learning be applied to Supply Chain ERP? Here is a glimpse of the now (descriptive) and the future (predictive and prescriptive).

Descriptive explains WHAT HAPPENED. Predictive provides advance insight into WHAT MIGHT HAPPEN. Prescriptive offers WHAT YOU SHOULD DO.
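As a toy illustration of the predictive piece (the data, features, and numbers below are entirely made up), a simple model can learn from past demand signals and forecast what a factory or warehouse should expect next, which a prescriptive layer can then turn into a recommended order:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy history: [week_of_year, promo_running, prior_week_units] -> units shipped.
X = np.array([
    [1, 0, 950], [2, 0, 980], [3, 1, 1010], [4, 0, 1200],
    [5, 1, 1150], [6, 0, 1320], [7, 0, 1280], [8, 1, 1300],
])
y = np.array([980, 1010, 1200, 1150, 1320, 1280, 1300, 1450])

# Predictive: learn the relationship between past signals and next-week demand.
model = LinearRegression().fit(X, y)

# Forecast week 9 (promo planned, last week shipped 1450 units).
forecast = model.predict(np.array([[9, 1, 1450]]))[0]
print(f"Forecasted units for week 9: {forecast:.0f}")

# Prescriptive (simplistic): order enough parts to cover the forecast plus a buffer.
safety_stock = 0.10
print(f"Suggested parts order: {forecast * (1 + safety_stock):.0f} units")
```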