Data to Decisions

Automation, Integration, Analytics, Reports & Dashboards

An End-to-End Data Analytics Walkthrough


Overview

This blog post provides a worked example of an end-to-end automated business intelligence solution. It demonstrates how to load data from different sources, join the data together, cleanse and correct data quality issues, compute hypercubes, and design a dashboard. I then show how to deploy the solution to the autonomous agent framework for continuous execution.
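As a rough outline of the pattern this post walks through, here is a minimal sketch in pandas of the load, join, cleanse, and aggregate steps. The file names and columns are hypothetical stand-ins; Flow performs these steps through its own actions rather than hand-written code.

    # Illustrative sketch of the load -> join -> cleanse -> aggregate pattern.
    import pandas as pd

    # Load data from two hypothetical sources
    orders = pd.read_csv("orders.csv")        # order_id, customer_id, amount, order_date
    customers = pd.read_csv("customers.csv")  # customer_id, region

    # Join (denormalize) the two sets on the shared key
    merged = orders.merge(customers, on="customer_id", how="left")

    # Cleanse: drop exact duplicates and fill missing regions
    merged = merged.drop_duplicates()
    merged["region"] = merged["region"].fillna("Unknown")

    # Hypercube-style summary: total amount by region and month
    merged["order_month"] = pd.to_datetime(merged["order_date"]).dt.to_period("M")
    cube = merged.pivot_table(index="region", columns="order_month",
                              values="amount", aggfunc="sum", fill_value=0)
    print(cube)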


Flow Crash Course - Part 9 - Deployment to Autonomous Agents


Overview

This is the ninth blog post in our crash course series on Flow. In this blog post, I provide an introduction to deploying the solution developed in the previous sections to the Flow Autonomous Agent Framework.


Flow Crash Course - Part 8 - Data Visualizations / Multi-dimensional HyperCube Tables


Overview

This is the eighth blog post in our crash course series on Flow. In this blog post, I provide an introduction to HyperCube Visualizations and Multi-dimensional HyperCube Tables in the Flow Computing Framework.


Flow Crash Course - Part 7 - Introduction to Results / HyperCube Dashboard Development


Overview

This is the seventh blog post in our crash course series on Flow. In this blog post, I provide an introduction to Results, Dashboard Design, and a first look at HyperCube Reporting in the Flow Computing Framework.


How to Import and Analyze Common Types of File Data Sources


Overview

This blog post provides a worked example of how to import and analyze Microsoft Access data. We learn how to use the Access Database integration interface to bring the sample Northwind database into Flow. A step-by-step walkthrough details how to denormalize the various relational tables into a consolidated, flattened set for analysis. We learn how to apply generic expressions to compute new data points on the fly. Finally, we learn how to leverage Flow's multidimensional analysis engine to compute hypercubes and summarize the data.
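For readers who want to see the same idea outside of Flow, here is a hedged sketch that reads Northwind tables from an Access file with pyodbc and pandas, flattens them, adds a computed column, and summarizes the result. It assumes the Microsoft Access ODBC driver is installed on Windows, and the file path, table, and column names follow the classic Northwind sample.

    import pyodbc
    import pandas as pd

    # Connect to the Access file through the ODBC driver (path is an assumption)
    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\Northwind.accdb")

    orders = pd.read_sql("SELECT * FROM Orders", conn)
    details = pd.read_sql("SELECT * FROM [Order Details]", conn)
    products = pd.read_sql("SELECT * FROM Products", conn)

    # Denormalize the relational tables into one flattened set
    flat = (details.merge(orders[["OrderID", "OrderDate", "CustomerID"]], on="OrderID")
                   .merge(products[["ProductID", "ProductName"]], on="ProductID"))

    # A computed column, similar in spirit to a generic expression in Flow
    flat["LineTotal"] = flat["UnitPrice"] * flat["Quantity"] * (1 - flat["Discount"])

    # Hypercube-style summary: revenue by product
    summary = flat.groupby("ProductName")["LineTotal"].sum().sort_values(ascending=False)
    print(summary.head(10))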


How to Denormalize (Join) Data Using Flow Analytics


Overview

This blog post demonstrates how to configure the denormalize function to join disconnected data sets. A worked example shows how to import and merge various delimited files. The denormalize action joins the data from the separate files into a single consolidated set for analysis. Once the data is joined, we learn how to use hypercubes to aggregate and summarize it.
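The equivalent join-then-summarize step looks roughly like this in plain Python tooling. The delimited file names, separator, and columns are hypothetical stand-ins for the files used in the worked example, not Flow's own syntax.

    import pandas as pd

    sales = pd.read_csv("sales.txt", sep="|")        # sale_id, product_id, qty, price
    products = pd.read_csv("products.txt", sep="|")  # product_id, product_name, category

    # Denormalize: a left join keeps every sale even if the product lookup is missing
    joined = sales.merge(products, on="product_id", how="left")
    joined["revenue"] = joined["qty"] * joined["price"]

    # Aggregate and summarize, roughly what a hypercube does across its dimensions
    cube = joined.pivot_table(index="category", values=["qty", "revenue"],
                              aggfunc="sum", fill_value=0)
    print(cube)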


Building Grouped Reports with Flow Analytics


Overview

Here is the second in a series of posts focusing on building reports in Flow. A grouped report is an advanced report produced by Flow. Grouped reports organize records into one or more nested groups, where each group is a collection of records that share a common column value. There are two basic methods you can employ to create grouped reports in Flow. The first is to add a Grouped Report action to a new or existing workflow. The second is to open a hypercube within the Flow portal and click the Create Report button in the toolbar at the top of the hypercube view. This post covers the first method.
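Conceptually, a grouped report nests records under one or more grouping columns and shows a subtotal per group. Here is a minimal sketch of that idea in pandas; the columns and sample values are illustrative only, not the report engine Flow uses.

    import pandas as pd

    df = pd.DataFrame({
        "Region":   ["East", "East", "West", "West", "West"],
        "Customer": ["Acme", "Blue Co", "Crest", "Crest", "Delta"],
        "Amount":   [120.0, 75.5, 310.0, 42.0, 99.9],
    })

    # Nested groups: Region, then Customer, with a subtotal at each level
    for region, region_rows in df.groupby("Region"):
        print(f"Region: {region}  (subtotal {region_rows['Amount'].sum():.2f})")
        for customer, cust_rows in region_rows.groupby("Customer"):
            print(f"  Customer: {customer}  (subtotal {cust_rows['Amount'].sum():.2f})")
            for _, row in cust_rows.iterrows():
                print(f"    {row['Amount']:.2f}")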


Use Flow Analytics + Artificial Intelligence to Analyze the News


Overview

In this blog post, I provide a worked example demonstrating how to design a workflow that extracts and analyzes cryptocurrency news articles using artificial intelligence. I explain how to use the HTML integration interface to extract the links for all top news stories from a target website into a structured data set. I show how to use generic expressions to transform and clean the raw links, preparing them for processing. Flow then loops through each of the structured links and invokes the built-in Watson artificial intelligence functions to perform advanced cognitive analytics against the text of each news article. Flow collects the results of the cognitive analysis and compiles an aggregate dataset of sentiments, emotions, concepts, topics, keywords, and named entities for all of the supplied articles. I finish the example by showing how to compute hypercubes against the cognitive output to summarize the results and generate various multidimensional views.
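To make the extract-loop-analyze pattern concrete outside of Flow, here is a hedged sketch that scrapes article links and sends each one to IBM Watson Natural Language Understanding via the public ibm-watson Python SDK. The target site, credentials placeholders, and link filter are assumptions, and this is not the same code path as Flow's built-in Watson action.

    import requests
    from bs4 import BeautifulSoup
    from ibm_watson import NaturalLanguageUnderstandingV1
    from ibm_watson.natural_language_understanding_v1 import (
        Features, SentimentOptions, EmotionOptions, KeywordsOptions,
        EntitiesOptions, ConceptsOptions)
    from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

    # 1. Extract and clean the top-story links from the target page
    page = requests.get("https://example-crypto-news.com", timeout=30)
    soup = BeautifulSoup(page.text, "html.parser")
    links = {a["href"] for a in soup.select("a[href]") if "/news/" in a["href"]}

    # 2. Loop through each link and run cognitive analysis on the article
    nlu = NaturalLanguageUnderstandingV1(
        version="2022-04-07",
        authenticator=IAMAuthenticator("YOUR_API_KEY"))
    nlu.set_service_url("YOUR_SERVICE_URL")

    results = []
    for url in links:
        analysis = nlu.analyze(
            url=url,
            features=Features(sentiment=SentimentOptions(), emotion=EmotionOptions(),
                              keywords=KeywordsOptions(limit=5),
                              entities=EntitiesOptions(limit=5),
                              concepts=ConceptsOptions(limit=3))).get_result()
        results.append({"url": url,
                        "sentiment": analysis["sentiment"]["document"]["label"],
                        "keywords": [k["text"] for k in analysis["keywords"]]})

    # 3. The aggregated results can then be summarized, e.g. sentiment counts per source
    print(results)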


Doing Data Quality with Flow Analytics


Overview

In this article, I provide an introduction to measuring and evaluating data quality using Flow. I briefly discuss data quality dimensions and data quality assessment. Then I examine how a schema-on-write approach increases the time and cost required to assess data quality along with a brief discussion of schema-on-read technology. I then introduce Flow's "Generic Data" technology as a solution to the deficiencies of schema-on-write and schema-on-read for data quality. Finally, I provide a hands-on working example of doing data quality in Flow Analytics using some sample name and address data. 
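As a taste of the kind of checks the hands-on example works through, here is a hedged sketch of rule-based completeness and validity measures on name-and-address data. The sample records, reference list, and rules are illustrative, not the ones used in the post.

    import re
    import pandas as pd

    df = pd.DataFrame({
        "name":  ["Jane Doe", "", "Bob Smith", None],
        "zip":   ["30301", "3030", "90210", "ABCDE"],
        "state": ["GA", "GA", "CA", "ZZ"],
    })

    valid_states = {"GA", "CA", "NY", "TX"}   # truncated reference set for the example
    zip_pattern = re.compile(r"^\d{5}(-\d{4})?$")

    checks = pd.DataFrame({
        "name_complete": df["name"].fillna("").str.strip() != "",
        "zip_valid":     df["zip"].fillna("").apply(lambda z: bool(zip_pattern.match(z))),
        "state_valid":   df["state"].isin(valid_states),
    })

    # Per-rule quality scores: the share of rows passing each check
    print(checks.mean().rename("pass_rate"))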


An Introduction to Building Dashboards in Flow Analytics


Overview

Flow enables you to build dashboards containing a variety of elements, including tables, charts, reports, and data summaries. This post focuses on two methods you can use to build, populate, and update dashboards. I show how to add a new dashboard, then how to create and add a chart result using one of the sample datasets provided. Next, I provide an in-depth discussion of adding workflow-generated results to a dashboard.
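For context, a chart result is essentially a summarized dataset rendered as a visual. Here is a standalone sketch of that idea using matplotlib; Flow builds its chart results inside the portal, and the data and labels below are purely illustrative.

    import matplotlib.pyplot as plt

    # A small summarized dataset, like the output of a hypercube
    regions = ["East", "West", "North", "South"]
    revenue = [120_500, 98_300, 76_900, 143_200]

    fig, ax = plt.subplots(figsize=(6, 4))
    ax.bar(regions, revenue)
    ax.set_title("Revenue by Region")
    ax.set_ylabel("Revenue (USD)")
    fig.tight_layout()
    fig.savefig("revenue_by_region.png")   # an image that could back a dashboard tile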
