Attending EMSDC ROAR (Return On All Relationships) Event

RETURN ON ALL RELATIONSHIPS

On September 17th, we expect a full day of B2B matchmakers, corporate roundtables, dynamic speakers, targeted workshops, and, new this year, “Ideation Sessions” at this wonderful event presented by EMSDC. They’ve also added a pre-conference workshop on September 16th for diverse businesses to work on their pitches, followed by the Welcome Reception at Landshark Bar and Grill on the beach.

This is one event you surely don’t want to miss. We look forward to seeing you there. Click here to register.

Digital Transformation: Conagra Case Study

CIO Helps Conagra Turn Food Trends Into Products

A case study of how Conagra’s CIO, Mindy Simon, leveraged an AI platform to help drive growth, including the launches of new products under the company’s existing brand names. The platform sources data from the likes of Facebook, Google, Instagram, and Pinterest, as well as from various market research companies, with frequent updates. Pulling the data is automated, as is prepping it for analytics applications.

The tool lets the business identify “pockets of growth” that might otherwise have gone unnoticed, largely because the platform can tie data together in one place.
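The pattern described above (automated pulls from consumer platforms, prepped into a single analytics-ready table) can be sketched in a few lines of Python. This is purely illustrative and reflects nothing of Conagra’s actual platform; every source, term, and function name below is made up.

```python
# Illustrative sketch only: automated pulls from several consumer
# platforms, normalized into one table for trend analysis.
# All names and numbers here are hypothetical.

def pull_social_mentions(platform):
    """Stand-in for an automated connector to a platform's API."""
    sample = {
        "instagram": [{"term": "plant-based", "mentions": 5200}],
        "pinterest": [{"term": "keto snacks", "mentions": 3100}],
    }
    return sample.get(platform, [])

def prep_for_analytics(raw_rows, platform):
    """Normalize each feed into a common shape for analytics."""
    return [{"platform": platform, "trend": r["term"], "volume": r["mentions"]}
            for r in raw_rows]

# The scheduled job: pull from each source, prep, land in one table.
analytics_table = []
for platform in ("instagram", "pinterest"):
    analytics_table.extend(
        prep_for_analytics(pull_social_mentions(platform), platform))

print(analytics_table)
```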

 

Conagra AI application (2 pages / 850KB)

 



Copyright (c) 2019 Dow Jones and Company, Inc., CIO Journal, 07/22/2019

Data Virtualization: The Modern Data Integration Solution

Data virtualization is a modern data integration approach that is already meeting today’s data integration challenges, providing the foundation for data integration in the future. This paper covers the fundamental challenge, explains why traditional solutions fall short, and introduces data virtualization as the core solution.

Click here to read more (PDF, 11 pages, 1.84MB)

Business Intelligence in the age of Analytics and Artificial Intelligence

ADVANCE TO INTELLIGENT DECISION-MAKING

Jonathan Bach & Arvind Handuu

LET US START BY GETTING THE BASICS OUT OF THE WAY:

– DATA IS NOT THE NEW OIL. IMPORTANT, YES, BUT BUSINESS IS ABOUT DOING THE BASICS RIGHT.

– NOT ALL BUSINESSES ARE INFORMATION BUSINESSES.

– ARTIFICIAL INTELLIGENCE IS NOT INTELLIGENCE, JUST A BEST GUESS.

– DATA BY ITSELF, DEVOID OF CONTEXT, IS WORTHLESS.

These days, one cannot pick up a trade publication without noticing a convergence on the flavor of the day, namely Analytics and its ‘on-steroids’ version, Artificial Intelligence (AI). The collective wisdom points to the efficacy of data utilization to transform the business. Many of our clients have gone through this stage of data envy. Some continue to allocate a disproportionate share of scarce time and treasure to “monetize their data”.

Like all gold rush stories, this one is fraught with peril. The primary reason appears to be the significance attached to the technology as opposed to the business objectives. We’ve seen this before, with corporations jumping from one tool to another in hopes that the next tool will be the one to save the initiative, the company, or the project. The real problem is the training, or lack thereof, within organizations to ask the right question. Here, the question becomes the answer.

The goal, it appears, is to replace, or at least augment, an organization’s intuition- and experience-based decision-making with an element of data centricity, to avoid missteps and to shine a spotlight on certain blind spots. Data centricity assumes a level of data availability and education; without this, further progress is impossible. We propose the following stepwise approach to help clients optimize their operations and leverage data assets where possible and necessary.

Automate and Optimize – Most businesses have implemented some level of automation, some more than others; that is the starting point. Before beginning any large-scale transformation, challenge what is already implemented, even if it is nothing more than a general ledger at this time. This is a foundational step in transforming a company. The key watchwords at this stage are flexible and adaptable: know that what is built here will need to be changed and rebuilt, so the automation platform should be both.

Digital from the outset – This is an appropriate analytical step in the organization’s life cycle, as the model should be to think innovatively about key aspects of the business. The approach here is to develop transformative competencies that have the potential to be disruptive (think Uber versus taxis, or Airbnb versus hotels), along with operational transparency. In this stage the company starts to build its data competencies.

Value creation – With the core of data collection in place, the organization may launch the build stage of the analytics framework. The organization decides to make key data available to executives to inform tactical decisions and strategic choices. The key success factors in this process are:

Analytics Evangelist – an organization needs to dedicate a role to developing data insights and educating line managers on the possibilities of data analysis

Speed to deliver – Time kills energy; this is as true in companies as it is in physics. The organization should have the necessary strategy in place to make the results of analytics available to managers within a few days of demand

Keep it simple – an organization must simplify the learning and deployment of analytics by reducing rework. It’s often helpful to choose technologies that offer a consolidated BI and analytics approach.

Adopt and adapt (reduce human variability) – The weakness of data-driven decision-making often lies in ignoring the human factors in adoption:

Education – Implement programs to drive overall data competency in the organization

Build momentum through ease of access – make it easier for executives to get analytical answers

Implement and measure the use of analytical elements in corporate decision-making

Reward the utilization of analytical tools

New applications – Encourage the use of analysis by promoting new applications and new analyses. Share and promote them.

Critique – Set periodic review points to analyze decisions made with the support of hard, measured data as well as decisions made without adequate data. Compare the outcomes.

Educate and Evolve – Business Intelligence and Analytics offers high ROI and can be implemented very cost-effectively. The organization needs to ensure a steady process of training users on the new data and analyses being made available, and on effective strategies for utilization.

In the end, data can be a very effective tool for the decision-makers in an organization; the power of data becomes available to the organization as more investment is made in developing the human capital to ask the right questions of the business. Features in modern BI and analytics tools can help users overcome delays and difficulties by automating aspects of data exploration and analytics development, and by delivering answers and recommendations to users in the context in which they need them.


Jonathan Bach is a Client Solutions Partner at Visvero, Inc. He works closely with various F2000 clients in optimizing the use of professional services for “ADVANCE”-ing along the BI/Analytics path. Jon is based out of Visvero, Pittsburgh.

 

Arvind Handuu is a Practice Manager for Business Intelligence & Analytics at Visvero. Arvind is an analytics value purist. He believes that a BI & Analytics platform should be a self-contained and sovereign solution. “Its value drops to zero the instant you are using a different data source to inform your decision.” Arvind is based out of Visvero, Pittsburgh.

Data Virtualization

DATA VIRTUALIZATION – WHAT’S REAL?

Meenakshinathan Padmanabhan & Arvind Handuu

Unless you’ve been living under a rock for the past few years, or have chosen not to look at a printed word, you’ve been told, often enough to be physically fatiguing, that data is what you produce and that data is the new oil: the real cure for all that ails us. That once we organize it and give it a more well-rounded experience and education, the world will be a better place. My cat is now afraid of data; it’s data’s world, not the cat’s, and all of us, cat included, just rent it.

To be fair, though, the impact of data – its availability, regeneration, volume, age, lineage, utility, diversity, value, and reach – though at times overstated, is significant enough that organizations are well served by continually scanning the environment for opportunities that make value creation possible.

In a 2018 Forbes survey, data virtualization was among the top three highest-growth areas for 2018/2019.

This post is an attempt to simplify the conceptual presentation of data virtualization. So let’s take it from the top.


So, what is data virtualization? What is a good use case for it? And what are the corner cases where this approach fails?

Data management, especially in applications that require after-the-fact analysis of institutional data assets, is an ever-moving target. It appears that just as a seemingly effective governance model is implemented and the initial set of questions is getting answered, new questions arise, often requiring new information and data sources to be incorporated into the answer base. This requires reworking the data warehouse, introducing the new data source, and applying the same rigor to ensure data hygiene; an expensive build, add, analyze cycle repeats. Data virtualization offers a near-term reprieve from this cycle by making it easy to introduce new data sources rather quickly. This approach has the potential of being THE solution in less complex data environments, and at least a resilient intermediate solution in applications with higher data complexity.


Data virtualization is the process of offering data consumers a data
access interface that hides the technical aspects of stored data, such
as location, storage structure, API, access language, and storage
technology.

Data virtualization creates integrated views of data drawn from disparate
sources, locations, and formats, without replicating the data, and delivers
these views, in real time, to multiple applications and users. Data
virtualization is any approach to data management that allows an
application to retrieve and manipulate data without requiring technical
details about the data, such as how it is formatted at source, or where it
is physically located, and can provide a single customer view (or single
view of any other entity) of the overall data. Data virtualization can draw
from a wide variety of structured, semi-structured, and unstructured
sources, and can deliver to a wide variety of consumers. Because no
replication is involved, the data virtualization layer contains no source
data; it contains only the metadata required to access each of the
applicable sources, as well as any global instructions that the
organization may want to implement, such as security or governance
controls. This concept and the software that implements it are a subset of data integration and are commonly used within business intelligence, service-oriented architecture data services, cloud computing, enterprise search, and master data management.
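To make the “metadata only” point concrete, here is a minimal sketch in Python of a virtual view that stores no rows of its own, only instructions for reaching and joining its sources at query time. This is an illustration of the concept, not any vendor’s API; the source names, fields, and join keys are hypothetical.

```python
# Minimal sketch of a data virtualization layer: the layer holds only
# metadata (how to reach and combine each source), never the rows.
# All source names and fields here are hypothetical.

import sqlite3

def fetch_crm_customers():
    """Source 1: a relational store (stand-in for a production database)."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Acme Foods"), (2, "Beta Grocers")])
    return [dict(id=r[0], name=r[1])
            for r in db.execute("SELECT id, name FROM customers")]

def fetch_web_orders():
    """Source 2: a semi-structured feed (stand-in for a JSON API)."""
    return [{"customer_id": 1, "sku": "SOUP-01", "qty": 12},
            {"customer_id": 2, "sku": "CHILI-02", "qty": 5}]

class VirtualView:
    """Holds only metadata: which fetchers to call and how to join them."""
    def __init__(self, left, right, left_key, right_key):
        self.left, self.right = left, right
        self.left_key, self.right_key = left_key, right_key

    def query(self):
        # Rows are pulled from the sources at query time, joined in
        # flight, and never persisted in the virtualization layer.
        right_index = {row[self.right_key]: row for row in self.right()}
        for row in self.left():
            match = right_index.get(row[self.left_key])
            if match:
                yield {**row, **match}

# A "single customer view" spanning both sources, defined purely as metadata.
customer_orders = VirtualView(fetch_web_orders, fetch_crm_customers,
                              left_key="customer_id", right_key="id")
for row in customer_orders.query():
    print(row)
```

Because VirtualView holds only metadata, adding a new data source means registering one more fetcher, not rebuilding a warehouse; that is the reprieve described above.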

The concept was initially incorporated in various business intelligence tools, such as Qlik, Spotfire, and Tableau, to name a few. The obvious limitation was the close coupling between the virtual data store and the choice of analytical (at the time, mainly data visualization) tools. That meant the limitations of the analytical tools defined the extent to which data could be utilized. The graphic below represents the data virtualization approach of one of the leading solution vendors in this technology, Denodo.

Image Courtesy: Denodo

Our teams have taken the position that for very small database volumes and relatively clean data sources, data virtualization is an effective solution, allowing a federated data structure and quick analytics delivery. However, as data complexity increases, organizations will need more disciplined data governance practices, effected in a data-warehouse-led analytics platform. In such cases, a virtualized database solution can be used as a rapid proof-of-concept to test various source systems.

We find data virtualization highly effective in the following use cases:

‣ Generally structured data sources with easy-to-define relationships. Referring to the promise stated earlier in this article, data virtualization really does deliver on the data integration front. Whether one needs data from a mobile application or from hundreds of domains and other web technologies, data virtualization consolidates all of it into a single solution.

‣ Data virtualization supports the integration of structured and
semi-structured data, and is seamlessly supported by the likes of Hadoop
and MapReduce.

‣ Rapid analytics delivery or short-term proof-of-concept solutions. Unlike some massive data management solutions, data virtualization can be implemented at an unnervingly rapid rate: it can be deployed on top of existing infrastructure in a matter of weeks or months. Some data virtualization adopters have reported an ROI turnaround of less than six months.

‣ Direct exposure to the source applications. A key reason for data virtualization is the ability to incorporate operational data in real time.

While the above might appear compelling, data virtualization falls short in
the following key application areas:

‣ Historical and lineage tracking applications, e.g., Slowly Changing Dimension Type I/Type II problem areas. Organizations need to use data warehouses when there is a need to analyze data that is days, weeks, or even months old; data warehouses are the better option in this case (see the sketch after this list).

‣ Data virtualization often imposes a great deal of stress on the organization’s operations, often requiring massive overhead. Changes to the virtual layer need to be integrated and distributed to every user and application across the entire infrastructure, which can be a huge financial and logistical strain on the environment.

‣ Overall effectiveness: data virtualization solutions can be deceptively difficult, and their effectiveness in managing real-time data delivery can be a little underwhelming. The expectation gap usually occurs when an organization assumes that because it is using a powerful data virtualization solution, it no longer has to manage its own data.
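To see why history and lineage trip up a virtual layer, consider the Slowly Changing Dimension Type II pattern flagged in the first point above: a warehouse dimension retains every version of a record with validity dates, while a live source, and hence a virtual view over it, exposes only the current row. A minimal sketch in Python, with hypothetical table and column names:

```python
# Why a virtual view cannot answer "as-of" questions on its own:
# the live source holds only current state, while an SCD Type II
# dimension in a warehouse retains every historical version.
# Table and column names here are hypothetical.

# What the live source (and hence a virtual view) exposes: current state.
source_customer = {"id": 1, "name": "Acme Foods", "region": "East"}

# What a warehouse SCD Type II dimension retains: one row per version.
dim_customer = [
    {"id": 1, "region": "West",
     "valid_from": "2018-01-01", "valid_to": "2019-03-31"},
    {"id": 1, "region": "East",
     "valid_from": "2019-04-01", "valid_to": "9999-12-31"},
]

def region_as_of(dim_rows, customer_id, date):
    """Answerable only with retained history, not with a live view."""
    for row in dim_rows:
        if (row["id"] == customer_id
                and row["valid_from"] <= date <= row["valid_to"]):
            return row["region"]
    return None

print(region_as_of(dim_customer, 1, "2018-06-15"))  # "West": history kept
print(source_customer["region"])                    # "East": all a live view sees
```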

In the data management space there are very few, if any, magic bullets. Data virtualization is an effective Swiss Army knife in a data architect’s or solution strategist’s toolkit. While data virtualization is far from perfect today, the overall market is evolving at a rapid rate to provide access to real-time, easily managed data. And while it falls short as the sole mode of capturing, interpreting, and managing BI data, the virtualized data warehouse is an effective strategy to create business value and introduce additional data sources into the analytics framework.



Meenakshinathan (Nathan) Padmanabhan is a Sr. Data Solutions Architect at Visvero, Inc. He has been supporting various F2000 clients in deploying effective data management, Business Intelligence, and Analytics solutions for over 20 years. Nathan is based out of Visvero, Pittsburgh.

 


Arvind Handuu is a Practice Manager for Business Intelligence &
Analytics at Visvero. Arvind is an analytics value purist. He
believes that a BI & Analytics platform should be a
self-contained and sovereign solution. “Its value drops to zero the
instant you are using a different data source to inform your
decision.” Arvind is based out of Visvero, Pittsburgh.

TABLEAU 8 IS HERE: A QUICK REVIEW OF MY FAVORITE FEATURES.

When you say Business Intelligence it often conjures up different visions. For example, when I tell people I work in BI, I get reactions ranging from blank stares to “Oh…so computer stuff,” to my personal favorite, my barber asking if that meant corporate espionage. While I do like to entertain the image of myself as a slightly geeky (and perhaps less fit) James Bond, it probably isn’t what most people think of when they think of working in Business Intelligence.

My bet would be that the image in most people’s minds is probably a visual of an analytic dashboard. Dashboards, after all, are the face of Business Intelligence. The software that enables us to convert raw data into dashboards is crucial to enabling business users to consume information in a user-friendly form and then take that information and make informed business decisions. It’s all about taking data and telling the story in a meaningful, actionable way. With the release of Tableau 8, one of the leading BI vendors is upping their game.

Tableau Software is rated as one of the leaders among BI vendors in Gartner’s Magic Quadrant for Business Intelligence and Analytics Platforms, and this past March they celebrated the release of their best version yet. I’ve had exposure to a lot of different BI visualization tools over the years, and Tableau is definitely one of my favorites. The tools they provide are easy to use, intuitive, and powerful, allowing for rapid development and deployment of interactive dashboards. These dashboards give the data context and help make it meaningful to the end user.

While Tableau 8 is jam-packed with new features, I wanted to concentrate on just a few of my favorites.

Forecasting

Visvero is consistently reminding our clients that business intelligence is all about finding the stories hidden within the data. All data has a story to tell, but without help it can be difficult to understand what that story is. Giving proper context to raw numbers allows us to look at historical data and understand the story so far. However, like that ever-frustrating cliffhanger ending or “to be continued…” notation after an enthralling TV show, we are always nagged by curiosity about what comes next. What is the next chapter in our story?

In the business world this isn’t just a nagging curiosity. Many times it comes down to the difference between success and failure. Without the ability to intelligently look to the future and anticipate the demands of tomorrow, you are endangering the future of your business. It’s no wonder, then, that the ability to forecast data (predictive analytics) is one of the most popular demands for BI solutions. In Tableau 8 it is about as easy as it gets: with a few mouse clicks you can add a forecast in seconds, and easily tweak your forecasting model to suit your needs.
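For readers curious about what a one-click forecast is doing under the hood, models of this kind are typically variants of exponential smoothing. Here is a minimal sketch of Holt’s linear method in Python; it is a generic illustration, not Tableau’s actual implementation, and the sample data and smoothing parameters are made up.

```python
# Illustration of the kind of model behind one-click forecasting:
# exponential smoothing with a trend term (Holt's linear method).
# Generic sketch only; sample data and parameters are hypothetical.

def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    """Fit Holt's exponential smoothing, then project `horizon` steps."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        # Blend the new observation with the previous level-plus-trend.
        level = alpha * y + (1 - alpha) * (level + trend)
        # Update the trend estimate from the change in level.
        trend = beta * (level - last_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

monthly_sales = [102, 108, 115, 119, 127, 134]  # hypothetical history
print(holt_forecast(monthly_sales))             # next three periods
```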

New Visualizations

Of course, one of the biggest parts of any business intelligence tool is the visualizations it can create. In Tableau 8, the visualizations are diverse and very informative. New options like word clouds and bubble maps are great additions. However, my personal favorite is the tree-map. Not only are tree-maps great chart types for giving context, but the way Tableau handles them allows for bar charts of tree-maps. This combination can be a powerful way to show relative proportions across categories in a beautifully simple way. It’s easily, in my opinion, one of the most powerful of the new view types.

In addition, Tableau 8 now supports overlapping objects on your dashboard. As anyone who has spent significant time with visualization tools can tell you, arranging the design and layout of a dashboard can be a tedious and frustrating task, especially when trying to make the most of your screen real estate. Tableau 8 makes laying out the dashboard nearly hassle-free by letting you overlap visualizations, empowering you to make more efficient use of screen space and easily mold your layout to your needs.

Visual Grouping / Set Improvements

The concepts of groups and sets are nothing new to Tableau, but with version 8 they’ve been shined up and improved. Grouping can now be done visually and on the fly as you select objects in your view, and those groups are quickly color-coded to “paint” pre-defined selections to help guide the user. The sets functionality has also been streamlined and improved to allow for more advanced set comparisons, giving you more analytic power than ever before. Pairing dynamic calculated sets with painting via visual grouping makes Tableau 8 even more powerful as an in-depth analysis tool.

Subscriptions

Building the best dashboard in the world isn’t going to be of any use if the users who need it aren’t looking at it. We all know that management is often busy in meetings and completely forgets to review reports before heading to the next one. That’s where the new subscription feature in Tableau 8 comes to the rescue. It allows users to subscribe to one or more worksheets and receive scheduled emails with images of the selected worksheets, along with links to the live reports.

Tableau is always pushing the limits, and their new version is certainly packed with features to be explored and used. Check out their website for a more in-depth look at the new tool, and try to catch the Tableau team on tour to see more of what Tableau has to offer! These are just a few of the new features in Tableau 8, but they are some of the features I am personally excited to see and use in my own projects. What features are you excited to try out in the new release? Let me know below in the comments.

PARTNER – RECRUITER | PREDICTING IT RESOURCES FOR NEXT WEEK…

In the ever-changing world of recruiting and sales in IT, I sometimes find myself wondering which technologies are next to take off. Every day, it seems, some new technology catches the industry’s fancy. Newer methods and newly packaged tools all appear to ride up and down the hype cycle at a rapid pace. Last year it seemed you could not crack open a journal without seeing Big Data; the year before, Cloud; and before that, Self-Service, Apps, Monetization, and on and on. (And more is coming; take a look at the Gartner Hype Cycle chart for 2016.)

I hear a lot about Artificial Intelligence (AI) these days. AI, along with smart data discovery, IoT, smart robots, and cognitive expert advisors, is rapidly edging toward the peak of the cycle, before it becomes truly productive. This appears to be the new area of huge growth, with a relative lack of qualified candidates. Given that very few schools here or overseas teach this as a discipline, the lack of an adequately qualified resource pool is particularly troubling.

Having been in this business for years, I am often asked: “How do we find these types of candidates when they are either employed or don’t exist?” The answer, I’m afraid, is… you don’t. What you do instead is create talent, much the same way a great coach creates an outstanding football or basketball program. You find people who can, not just those who already have, and you nurture and adjust and develop.

In this world, what we tend to do is find candidates who have certain engineering, mathematics, and software experience mixed with a knowledge of software development and IT architecture. We work with the clients to develop a program, educate both player (candidate) and environment (company), and help select a candidate whose skill set completes the team even if individuals lack complementary areas of importance. Truly, it comes down to how well you know and understand your team and what is needed to balance it out. The more you know and understand about the team’s needs, the better your results will be.

I get this question a lot: “How are you able to get information from your clients on what they truly need?” The answer is, and always will be, your knowledge, flexibility, partnership, and relationship. Even so, clients will only share information with you if you have benefited them in the past. It takes time, dedication, investment, and tenacity to ultimately educate yourself on what benefits the client, not you.

Predicting what next week looks like should always be something you do as you gain experience and knowledge and develop your business.