Webinar: Take Your Enterprise Analytics to the Next Level with Native BI Platforms for Data Lakes

Aired: April 19, 2018

Many large modern enterprises are data-aware – they deploy processes to transform raw data into information using a variety of data integration, data management, and business intelligence (BI) tools. But being data-aware, or even data-driven, does not necessarily mean being insights-driven.

Are your BI applications providing valuable insights? Are these insights prescriptive and actionable? Are these actions driving tangible business outcomes? In this webcast you will learn what it takes to move your BI environments to the next level by harnessing the power of a data lake to drive new insights and business agility.

Join our webinar, in which featured speakers Boris Evelson, Vice President and Principal Analyst at Forrester; Alex Gutow of Cloudera; and Steve Wooledge of Arcadia Data will discuss:

  • Benefits and challenges of becoming an insights-driven business
  • Benefits of bringing BI to data (vs. bringing data to BI)
  • Evolution and best practices for modernizing BI through data lakes
  • Getting the full value of your data with agile BI
  • Real-world customer successes

Find out how you can drive new insights now:

Take Your Enterprise Analytics to the Next Level With Native BI Platforms for Data Lakes

Presenters

Boris Evelson

Vice President and Principal Analyst

Steve Wooledge

Chief Marketing Officer

Alex Gutow

Senior Product Marketing Manager

Transcript

[00:00:04] Stephen Faig: Welcome to today's webcast, brought to you by Arcadia Data. I'm Stephen Faig, Director of Database Trends and Applications and Unisphere Research, and I will be your host for today's broadcast. Our presentation today is titled "Take Your Enterprise Analytics to the Next Level with Native BI Platforms for Data Lakes." Before we begin, I want to explain how you can be a part of this broadcast. There will be a question-and-answer session; if you have a question during the presentation, just type it into the question box provided and click the submit button. We will try to get to as many questions as possible, but if your question is not selected during the show, you will receive an email response. Plus, one lucky attendee today will win a $100 American Express gift card. The winner will be announced at the end of the event, so stay tuned to see if it's you.

[00:00:51] Now let me introduce our star speakers for today: Boris Evelson, Vice President and Principal Analyst at Forrester; Alex Gutow, Senior Product Marketing Manager at Cloudera; and Steve Wooledge, Chief Marketing Officer at Arcadia Data. For more information on our speakers, you can click on the arrow under their headshots on the console. Now I'm going to pass the event over to Boris.

[00:01:17] Boris Evelson: Excellent. Good afternoon, good morning, good evening, everyone. I see people dialing in from multiple time zones, so thank you for taking time out of your busy days to listen to this presentation. The main topic of the presentation is transforming your organizations from data-driven to insights-driven. What does that really mean? Most large enterprises today have some kind of data management strategy, architecture, platform, and plan. The data is integrated; they model the data, they collect the data, and they analyze the data with all sorts of BI, reporting, and data visualization platforms and solutions. So they are getting lots and lots of signals from the data, but are they getting actionable insights? Are those insights being transformed into tangible business outcomes? That's really the next level of maturity in what we today call insights-driven businesses. With that in mind, just to clearly differentiate between older-generation, simply data-driven capabilities and next-generation, insights-driven capabilities, at Forrester we like to use the term "systems of insight." Why did we choose that term? Because systems of record and systems of automation are terms already familiar to you, so we felt that "systems of insight" was a great term for talking about insights-driven businesses.

[00:03:00] The reason this is so important is what you'll find on the next couple of slides about what is being predicted: over the next few years, insights-driven companies are going to outpace the competition by 8 to 10 times. That's not 8 to 10 percent, but 8 to 10 times faster than the competition. What that really means is that the companies that are insights-driven today, or are going to become insights-driven, will take about $1.8 trillion away from their non-insights-driven competitors. That's a huge chunk of change. And today, and for the last couple of years, at Forrester I've been seeing a lot of proof points: a lot of our clients have already been reporting benefits from their BI investments, be they intangible, qualitative benefits such as better decision-making or better transparency, or tangible ones: they are increasing margins and profitability, they are increasing their sales, or they are seeing some other top- or bottom-line improvement. And income statement benefits are not the only ones playing a significant role here; balance sheet benefits, in terms of improving your working capital utilization and reducing inventory through working capital optimization, are some of the others. The benefits, and I'll use the term "tangible" for the rest of this presentation, are all over the place.

[00:04:58] But unfortunately, this is not as easy as it seems, and I'm sure most of you on the phone have lived through something like what you see on this slide: yes, a lot of us are drowning in data and still starving for insights. We've been hearing this for the last 10 years, so what is it that has really changed? That should be the question, and that's precisely what we will address in the rest of this presentation. Lots and lots of challenges are still out there. Just under half of all clients are still not realizing any kind of quantitative benefit from their business intelligence, and of those that do, just about half are taking more than a year to realize those benefits, and that's not quick enough; we'll talk about that in a couple of slides. One of the key challenges that we continue to see in this market is the continuing disconnect between business and IT. This is not about who's right and who's wrong, and it's not about criticizing anyone; it's just a realization that business and technology professionals, for the right reasons, do have somewhat conflicting priorities. Creating a single BI platform in an enterprise, creating a streamlined architecture, supporting BI centrally in hopes of obtaining that single version of the truth, is not a trivial process. It's not a trivial effort: it takes time, it carries costs, and it's not something that is plug-and-play. So these things do take time, and when our business partners and counterparts just want to get their jobs done quickly, efficiently, and effectively, and IT professionals don't realize that getting the business job done is the first priority, that's where we start getting this disconnect.

[00:07:11] A couple of very interesting data points on the next slide. On the one hand, and I'm sure all of you know this very well, the amount of data that we all store, process, and analyze is growing by leaps and bounds. What the right side of the screen is telling you is that the number of companies with 100 terabytes of data doubled last year. But what we do with that data really hasn't changed. As a matter of fact, the self-reported number, about a third of our data, structured and unstructured, being used for insights, analytics, and decision-making, is not a very realistic number, because guess what: you don't know what you don't know. Our clients tell us, "we think we use about 50% of our data today." You may be using 50% of everything that you know about, but there is a lot out there in your transactional, operational, structured, and unstructured data sources, internal and external, and at your partners, that you are just not aware of. It's also very interesting to note that we tend to get higher numbers from IT, and much lower numbers, much closer to the right side of this picture, when we ask the business users. And one last slide, I promise, before we get to the good part of the presentation, and I'm sure this is near and dear to everyone's heart: I'm sure everyone on the phone still experiences that the majority of your BI, analytical, and insights applications are not done in enterprise-grade BI platforms; they are still the familiar shadow-IT types of applications. How can you close the disconnect between business and IT? How can we address and analyze most of the data that we have? And how can we finally start getting weaned off spreadsheets?

[00:09:26] The key point to make here, before we get into any discussion of machine learning or artificial intelligence or big data, all terms that have become very popular over the last few years, is one that very few people are talking about: today we are in the age of the customer. This concept is much more important than big data or machine learning or anything else. What the age of the customer means is that most enterprises have to run and operate from the outside in. In other words, your consumers, or your citizens if you are a public-sector agency, do not really care, nor should they care, how you run your internal processes. They don't really care about your internal finance, risk management, supply chain, or any other processes. They have lots of options. Consumers are empowered with mobile phones and with cloud access to all of your competitors' products and services; literally with a single click of a button they can make a switch. Therefore, unless you as a business, you as an organization, are prepared to do everything in your power to follow the customers, you're going to fall behind, and you are not going to take advantage of the modern, global, customer-driven economy.

[00:10:57] So I've just introduced this term: agility. Being agile and flexible in response to customer needs is really the key to success, more so than big data, machine learning, or other technologies. All of those are very important, but being agile is really the key business capability that's going to allow you to be successful and win your customers away from the competition. With this in mind, a few years ago at Forrester we created this profile of how we measure business agility, and hopefully all of the attributes of business agility that you see here on the right are self-explanatory. Obviously, if your channels are integrated, you are more agile. If you can handle change management in an efficient manner, you're more agile. If your infrastructure is elastic and can grow and shrink depending on customer demand and customer requirements, you're going to be more agile.

[00:12:07] And I'm sure you're probably suspecting what I'm going to show you on this next slide: yes, indeed, there is a direct correlation. We found it a couple of years ago and we continue to track it: higher performers, on the right side of this picture (remember, higher performers are those companies, as I showed a few slides back, that grow faster than the competition and faster than industry averages), are all over what we call the "formidable" category. Yes, they are aware that agility is a key capability, and they are executing on it well. That's versus the left side of this picture: lower performers, those that fall behind industry averages and behind the competition, are all over what we call the "clueless" category (they don't know that this is important and they are not executing well); or they are aware but not executing, and we call them "paralyzed"; or they are doing something about it without really understanding what they're doing, and we call them "dangerous." Obviously, you want to be in the formidable category. So that's the point that I wanted to make here.

[00:13:23] Now, what is it that we in IT and in big data can do to create and deploy agile business models, so that we can practice agile big data analytics? As you can see, this is not just about agile software development. Yes, absolutely, agile concepts and rapid prototypes are extremely important, but it's not just that. It's agile organizations: finding the middle ground between organizational silos and overly centralized support for insights and analytics, because overly centralizing something creates bureaucratic structures, with layers of steering and planning committees and approval levels and arguments about prioritization, and therefore overly centralized support for BI often creates lots of bottlenecks. It's also processes: we recently published a report on BI governance, which is very different from data governance. BI governance is all about monitoring and understanding what it is that users are doing in their BI and analytical sandboxes and data lakes and data marts, and then selectively hardening, or productionalizing, the more widely used ones.

[00:14:58] But what we really want to talk about in the next 30 to 40 minutes are agile BI platforms, because earlier generations of relational databases, or earlier generations of SQL-only BI tools, can support mission-critical environments, but remember the number I shared with you a few slides ago: you may think you are processing about 50% of your data, but we find that you are really managing to process and analyze no more than 20% of it. So clearly a different type of big data and agile technology is needed to support these newer-generation requirements. That is why we no longer advocate a simplistic data architecture. I'm sure you remember how we used to draw the slides, with what we used to call the layer-cake architecture: at the bottom you had your data sources and data integration, in the middle you had a data warehouse and data marts, and at the top of the picture you had BI. The idealistic notion was that all of the data at some point was going to end up in the data warehouse. Well, guess what: 10 to 20 years later, somewhere between 20% and 50% is where we are ending up today. So how do we finally break through that barrier? How do we finally start processing and analyzing more than 20% of the data?

[00:16:36] Well, there are different treatments for different data layers. Not all data has to be in a data warehouse, and not all data has to be governed by a single version of the truth. If I speak to those of you in the finance organization, for your business two plus two always has to equal four, even when it takes a few days and a few long batch runs to calculate, because the books of the company really need to reconcile; you have no option there. But that's really a relatively small percentage of all of your enterprise data. If I'm speaking to the CMO or a VP of sales: you wake up in the morning, you read The Wall Street Journal, and you realize that your competition just lowered prices, introduced a new product, or acquired a new company, and you want to get that preemptive campaign out to your customers and prospects today. Figuring out customer segmentation for that razor-sharp, razor-focused campaign is not where you need the single version of the truth. Good-enough but timely data, where two plus two equals 3.9 or 4.1, is good enough for this particular use case.

[00:18:12] Obviously I've just described the two ends of the extreme, but there are different layers in here. So take a look at your requirements and tolerances for latency, for data quality, for risk, et cetera, and then figure out who is going to be accessing the data in a data lake versus a data warehouse. In the data lake, where you really want to ingest and store close to 100% of your data, there is no way that you can cleanse and govern 100% of it. Those organizations that attempt to do that basically take their data warehouse, rename it a data lake, and then the same challenges happen in the data lake environment. But when you start treating your data lake differently, you still govern it, just more lightly: you allow only qualified data scientists and power analysts, who understand what they are doing, access to the data lake. As you go up the pyramid, you apply more of the older-generation best practices in terms of tighter governance, and that is where your single version of the truth lives, and where you can open up access to your data hub and data warehouse to all of the casual users, with tightly controlled access.

[00:19:41] And it doesn't have to be three layers; your organization may be more complex, and you may need to create more layers. It could be just a file system at the bottom, some kind of flexible, schema-on-read type of SQL going up the stack, and then, as we go up the stack, we tighten the control, increase the governance, and open up access to more casual users, because this is where we have more control. This is the best practice we see; this is the way you can start breaking through that barrier of analyzing just 20% of the data.

[00:20:27] Interestingly, I do have quite a few clients who are beginning to architect their BI environments for processing all of the data. But are BI technologies really set up to handle that? Yes, all of the earlier-generation BI tools can access data in a data lake, regardless of the underlying platform, but these earlier-generation BI technologies are still sitting outside of the data lake. In other words, we are still bringing the data to BI. And when you bring data to BI, you are moving data in and out of the cluster, so all of that scalability inside the cluster now runs through a single bottleneck, a single choke point: the JDBC or ODBC connector. You are moving a lot of data across those points. And no matter how you store data in your data lake, a lot of the BI tools can still only access it via SQL. So even though the data lake can work with unstructured data sources and schema-on-read, by the time you architect everything that you see here, you are still doing schema-on-write and structured SQL.
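To make the schema-on-write versus schema-on-read contrast concrete, here is a minimal Python sketch; the record layout, field names, and helper functions are invented for illustration, not taken from any particular product. Schema-on-write shapes records to a fixed schema before storage, so fields outside the schema are dropped up front; schema-on-read lands raw records untouched and applies a schema only at query time, so different queries can project different views of the same raw data.

```python
import json

# Schema-on-write (data warehouse style): records are shaped to a fixed
# schema before storage; fields outside the schema are discarded up front.
def write_with_schema(raw_records, schema_fields):
    return [{f: rec[f] for f in schema_fields} for rec in raw_records]

# Schema-on-read (data lake style): raw records land untouched; a schema
# is applied only when a particular query asks for one.
def read_with_schema(raw_store, schema_fields):
    for line in raw_store:
        rec = json.loads(line)
        yield {f: rec.get(f) for f in schema_fields}  # missing fields -> None

raw_store = [
    '{"user": "a", "amount": 10}',
    '{"user": "b", "amount": 20, "channel": "mobile"}',  # extra field survives in the lake
]

# Two different views of the same raw records, each defined at read time:
billing_view = list(read_with_schema(raw_store, ["user", "amount"]))
channel_view = list(read_with_schema(raw_store, ["user", "channel"]))
```

The point of the sketch is that `channel` was never declared anywhere up front, yet the second view can still query it; under schema-on-write it would have been lost at load time.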

[00:22:13] Where this really hurts is in any kind of multi-dimensional operation. Say you have some kind of relational OLAP engine sitting outside of the cluster, and you issue one query and then a second query, and the only thing that you are changing is one of the dimensions. If you were doing this in a dimensionally aware relational database, the relational OLAP engine would be smart enough to know that it doesn't need to bring the entire result set back again; it doesn't need to re-execute the complete query; it just needs to adjust the query with the new dimension. But all of that is lost when the engine sits outside of the cluster, because the only thing that passes back and forth is SQL, without any kind of dimensional awareness. So when you run that kind of environment, not only are you moving data back and forth, but you are not really linearly scalable, even though you think you are: your queries are distributed and linearly scalable, but the rest of the BI environment does not scale.
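The dimensional-awareness point above can be sketched in a few lines of Python; the toy fact table, class names, and row-counting are hypothetical stand-ins for a real connector. A client with no dimensional awareness re-ships the full result over the connector every time the grouping dimension changes, while a dimension-aware engine builds the detailed cross-tab cells once and answers subsequent dimension swaps locally.

```python
from collections import defaultdict

SALES = [  # toy fact table: (region, product, amount)
    ("east", "widget", 100), ("east", "gadget", 50),
    ("west", "widget", 70), ("west", "gadget", 30),
]

class NaiveClient:
    """BI tool outside the cluster: every dimension change re-ships the
    full scan over the JDBC/ODBC-style connection."""
    def __init__(self):
        self.rows_transferred = 0
    def total_by(self, dim_index):
        out = defaultdict(int)
        for row in SALES:               # full result set shipped each time
            self.rows_transferred += 1
            out[row[dim_index]] += row[2]
        return dict(out)

class DimensionAwareClient:
    """Engine that caches the detailed (region, product) cells once, then
    answers dimension swaps by re-aggregating the cached cells locally."""
    def __init__(self):
        self.rows_transferred = 0
        self._cells = None
    def total_by(self, dim_index):
        if self._cells is None:         # single pass builds the cross-tab
            self._cells = defaultdict(int)
            for region, product, amount in SALES:
                self.rows_transferred += 1
                self._cells[(region, product)] += amount
        out = defaultdict(int)
        for key, amount in self._cells.items():
            out[key[dim_index]] += amount
        return dict(out)
```

Running two queries (group by region, then group by product) transfers every fact row twice for the naive client, but only once for the dimension-aware one.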

[00:23:44] In a native BI architecture, more and more of the components, not just the actual queries and the data that is already sitting inside the cluster, but the BI components themselves, the semantic layer and so on, are pushed down into the cluster itself. When you do that, you are not moving data in and out of the cluster, there is none of that extra WAN and LAN traffic, and there is no JDBC/ODBC choke point, because the BI engine accesses the data natively, right where it lives, alongside everything else. You are not limited to SQL-only access; you can analyze any kind of file, and all of it, data and metadata, is now in the same place. This is what we call bringing BI to data, as opposed to bringing data to BI. If you remember the previous slide, a lot of the components that were outside of that dotted line have been pushed down into the cluster. Now, in one version of this type of technology, you still run some of the components on an edge node, so there is still a little bit of room for single-threading. What we really want to see is a 100% complete distributed architecture, where the only thing the edge node is doing is rendering, and everything else, the last-mile ETL, the semantic layers, the cubes and in-memory caches, is pushed down to the individual data nodes, so that it is indeed 100% distributed and 100% scalable.
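The "bring BI to data" argument reduces to a simple accounting of what crosses the network, which a short Python sketch can show; the shard contents and function names here are invented for illustration. Pushing the filter and partial aggregation down to each data node moves one partial result per node, instead of shipping the whole table to an external BI engine.

```python
# Toy contrast between "bring data to BI" and "bring BI to data":
# the same filtered aggregate, computed two ways.

DATA_NODES = [  # each inner list is a shard held by one data node
    [("clicks", 1), ("views", 9), ("clicks", 3)],
    [("views", 4), ("clicks", 2), ("views", 7)],
]

def bring_data_to_bi(metric):
    """Ship every row over the connector, then filter and aggregate
    client-side. Rows moved = the whole table."""
    shipped = [row for node in DATA_NODES for row in node]
    total = sum(v for m, v in shipped if m == metric)
    return total, len(shipped)

def bring_bi_to_data(metric):
    """Push the filter and partial aggregation into each data node;
    only one partial result per node crosses the network."""
    partials = [sum(v for m, v in node if m == metric) for node in DATA_NODES]
    return sum(partials), len(partials)
```

Both paths compute the same total; only the amount of data moved differs, and the pushed-down version shrinks with the selectivity of the query rather than growing with the size of the table.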

[00:25:32] This modern data architecture is one of the approaches that lets you start analyzing, and driving insights from, all of your data, structured and unstructured, not just 20% of it but 100%, and do that at a high level of scalability. Alex, I'll turn it over to you.

[00:25:56.700]
alternate start with u

[00:25:58.799]
thank you very much for us at this point

[00:26:00.799]
in time I'm going to introduce our next speaker

[00:26:03.000]
Alex couto senior product marketing

[00:26:05.200]
manager at Cloudera

[00:26:07.799]
awesome thank you Stephen thank you Boris

[00:26:10.099]
So as mentioned,

[00:26:12.200]
and what Boris was speaking around, there are

[00:26:14.500]
a lot of aspects to consider when

[00:26:16.700]
you look at agile BI

[00:26:18.700]
and how to get more value and insight

[00:26:21.000]
from your data, and one which we'll be touching

[00:26:23.400]
on during this part is around the

[00:26:25.599]
technology side of things. A lot of times this

[00:26:27.700]
challenge on agility can actually

[00:26:29.700]
be due to

[00:26:33.900]
many of these limitations that we look

[00:26:36.000]
at, and many of them should be familiar to

[00:26:38.400]
folks on the call: with

[00:26:40.799]
existing infrastructure, with

[00:26:42.900]
this limitation on resources

[00:26:45.700]
and the prioritization needed

[00:26:47.799]
to ensure that the reports

[00:26:49.799]
and the SLAs that you're supporting

[00:26:51.799]
today can continue to run, it's

[00:26:54.799]
difficult to start to bring on new

[00:26:56.900]
user groups, more data, more

[00:26:58.900]
types of reports and use cases.

[00:27:01.200]
There's always a risk in having

[00:27:03.299]
to bring those in, and so

[00:27:05.500]
it can either be a very lengthy process

[00:27:08.099]
or just a straight

[00:27:10.200]
limitation or cutoff

[00:27:12.200]
on what data can actually

[00:27:14.299]
be analyzed, and who has access, and what

[00:27:16.299]
they can do with it. And a lot

[00:27:18.299]
of this can be linked to the cost

[00:27:20.299]
of these systems: it can be a very costly

[00:27:22.700]
expense to go and scale out the system,

[00:27:25.099]
possibly including some downtime there,

[00:27:27.200]
and it's pretty difficult

[00:27:29.299]
to start to justify some of those expenses

[00:27:31.599]
if it's just to be able to better

[00:27:33.599]
support the workloads that are running

[00:27:35.799]
today, versus also looking

[00:27:37.799]
at how to bring in new workloads. One

[00:27:40.500]
of the things that we see with our clients

[00:27:43.500]
quite a bit,

[00:27:45.799]
as one way to address this, is that while

[00:27:48.099]
you may have kind of an enterprise data warehouse

[00:27:50.400]
with a very specific set of reports

[00:27:53.200]
and use cases, there's

[00:27:55.400]
this proliferation of different data silos

[00:27:57.500]
that has popped up. So there's multiple

[00:28:00.599]
data copies being moved throughout the

[00:28:02.700]
organization, and these different data

[00:28:04.900]
marts are addressing specific departmental

[00:28:07.299]
needs, use cases, or specific

[00:28:09.299]
users. This can

[00:28:11.299]
also be pretty difficult to maintain

[00:28:13.400]
across a large enterprise:

[00:28:15.700]
it ends up being a very lengthy

[00:28:18.000]
process as you try and join

[00:28:20.000]
data together and as you open up access

[00:28:22.500]
to them, and maintaining all

[00:28:24.500]
those data copies itself adds in a lot

[00:28:26.599]
of inefficiencies there. And

[00:28:28.900]
then the other limitation that

[00:28:30.900]
we've seen, in particular for enabling a lot

[00:28:33.000]
more self-service, is

[00:28:35.500]
this shift away from some

[00:28:37.700]
of the pre-canned reports, and even being

[00:28:39.700]
able to get into more of this self-service provisioning,

[00:28:42.200]
enabling and

[00:28:44.500]
better empowering your end

[00:28:46.599]
users to dig in and discover new

[00:28:48.700]
insights, and oftentimes cloud

[00:28:51.400]
can be a great way to enable that and

[00:28:53.400]
give folks dedicated resources there as

[00:28:55.400]
well. And

[00:28:57.799]
so when we look at a modern

[00:28:59.900]
platform, our modern approach to tackle

[00:29:01.900]
that is, of course, being able

[00:29:04.000]
to provide the same performance

[00:29:06.099]
and concurrency, and

[00:29:08.099]
support for existing SQL skills and

[00:29:10.200]
any BI tool; that is a huge necessity

[00:29:12.900]
here, but the true value really lies in breaking

[00:29:15.099]
beyond just the

[00:29:17.099]
use cases that you're already supporting, into

[00:29:20.200]
being able to have more data flexibility:

[00:29:22.500]
very easily being able to

[00:29:24.599]
land any and all types of data,

[00:29:26.900]
being able to consolidate data from these

[00:29:29.500]
marts or different data sources, and

[00:29:31.799]
not having to model up front. So a

[00:29:33.799]
lot of those initial benefits of

[00:29:36.400]
the data lake that we tend to speak around:

[00:29:38.599]
being able to have data in its raw form and

[00:29:40.799]
then use that data for

[00:29:43.200]
new use cases and new questions

[00:29:45.299]
at hand; and the data

[00:29:47.299]
remains open, so

[00:29:49.299]
it's never locked into any proprietary

[00:29:51.599]
formats or whatnot. And the advantage here

[00:29:53.799]
is, as you start to consolidate this,

[00:29:56.799]
all of this data gets stored together

[00:29:58.900]
and multiple

[00:30:01.000]
different user groups have access to it. So

[00:30:03.000]
that same data that's available

[00:30:05.200]
for your BI, for your reporting, is

[00:30:08.000]
also available for your data engineering teams

[00:30:10.200]
to be able to process

[00:30:12.799]
or run ETL jobs over

[00:30:14.799]
it; it's available for your data science

[00:30:16.900]
teams; and you can easily operationalize any

[00:30:18.900]
reports or applications from it as well.

[00:30:20.900]
And then of course being

[00:30:23.099]
able to cost-effectively

[00:30:26.000]
scale out these

[00:30:28.200]
systems without it being a

[00:30:30.200]
major maintenance operation, so you're not having

[00:30:32.200]
to make any trade-offs or decisions

[00:30:34.299]
as to which data should be stored or which workloads

[00:30:36.900]
or reports to run on them.

[00:30:39.500]
And then finally, having this flexibility

[00:30:42.500]
to really be able to leverage

[00:30:45.099]
all of these modern benefits in

[00:30:47.400]
whatever architecture or deployment environment

[00:30:49.599]
you choose, be it on-premises

[00:30:52.099]
environments, be it natively

[00:30:54.099]
against any object store such as S3

[00:30:56.200]
or Microsoft ADLS, or a hybrid

[00:30:58.400]
combination of those.

[00:31:01.799]
And so what this looks like is

[00:31:04.400]
a chance to consolidate a

[00:31:06.500]
lot of these different data marts

[00:31:08.799]
that you would start to see within

[00:31:10.799]
your existing environment,

[00:31:13.099]
where you would have a modern

[00:31:15.400]
platform that can support a wide range

[00:31:17.700]
of these use cases that Boris was talking

[00:31:19.900]
around. One of the benefits of having this

[00:31:22.000]
modern platform is being able

[00:31:24.000]
to have more of a logical architecture

[00:31:26.200]
to support some of these cases

[00:31:28.500]
that have those varying degrees of governance:

[00:31:31.200]
not just being able to

[00:31:33.200]
support ad hoc data science and exploration,

[00:31:35.200]
but also being able to, as

[00:31:38.500]
you go, open it up to

[00:31:41.400]
regular reporting or

[00:31:43.400]
more production-

[00:31:45.500]
grade applications

[00:31:48.000]
as well, and also

[00:31:50.000]
still integrate with a lot of these enterprise

[00:31:52.200]
data warehouses that we see, where

[00:31:54.299]
you might want to push out some subset

[00:31:56.500]
of this data to the EDW

[00:31:59.099]
to support, again, some of

[00:32:01.099]
those more heavily modeled reporting

[00:32:03.799]
needs.

[00:32:07.099]
When we look at what some of the steps to get

[00:32:09.200]
there are, because of course the technology piece

[00:32:11.500]
is only one aspect of it (the technology

[00:32:13.700]
piece opens up this potential

[00:32:16.400]
to really get more value out of

[00:32:18.400]
your data with this insights-

[00:32:20.700]
driven model, but it's also

[00:32:22.799]
about reorienting and rethinking

[00:32:25.099]
what your team looks like),

[00:32:27.400]
these five steps here are taken from

[00:32:29.400]
us working with

[00:32:31.599]
a number of our customers as to how we get

[00:32:33.700]
them to be successful there. And

[00:32:36.500]
what we mean by each of these is:

[00:32:38.500]
building a data-driven culture just

[00:32:40.900]
means shifting to being able to

[00:32:43.000]
empower your end users, to let

[00:32:45.000]
them be able to discover what data is

[00:32:47.000]
of value and be able to ask

[00:32:49.200]
new questions, be able to dig in and

[00:32:51.200]
interact with any of the reports that

[00:32:53.299]
you have, to

[00:32:55.299]
be able to discover what data may be of value,

[00:32:57.700]
and then iterate as you go there.

[00:32:59.900]
In terms of building

[00:33:02.599]
the right teams and skills, this

[00:33:05.000]
is really evaluating what

[00:33:07.200]
you have in your existing teams

[00:33:09.799]
and companies. You

[00:33:11.799]
know, there's always a lot of talk around data scientists

[00:33:14.400]
and how difficult they may be to find,

[00:33:16.700]
but a lot of times the most valuable folks

[00:33:18.799]
are the ones sitting within your organization. Making

[00:33:21.599]
sure that data engineers especially are

[00:33:23.599]
part of this development matters, and

[00:33:26.200]
those folks have deep knowledge of

[00:33:28.400]
the data and can often be trained

[00:33:30.700]
more into the data science realm as well,

[00:33:32.900]
as I mentioned

[00:33:36.799]
quite a bit. And one of the big things here

[00:33:38.799]
is you don't need to over-architect

[00:33:40.900]
for perfection; you don't need to wait

[00:33:43.000]
for everything to be all set

[00:33:45.099]
in stone. Start small,

[00:33:47.400]
take your first use case, get that to success,

[00:33:49.799]
get folks excited, drive that adoption, and

[00:33:52.200]
then add more use cases from

[00:33:54.400]
there. And

[00:33:57.099]
then, definitely, in terms of how we

[00:33:59.299]
look at these use cases, making sure that

[00:34:01.299]
you're actually linking them to production

[00:34:03.299]
value as well. It

[00:34:05.299]
can be pretty hard to tie value to

[00:34:07.299]
just having a sandbox environment

[00:34:09.400]
or whatnot, so

[00:34:11.400]
make sure you're thinking of the end business value.

[00:34:13.400]
And then finally, when we look at being

[00:34:16.199]
able to right-size data governance: as

[00:34:19.099]
Boris mentioned, there are going to be varying degrees

[00:34:21.099]
of governance needed for each of these different

[00:34:23.400]
applications, but that doesn't

[00:34:25.500]
necessarily need to be a limitation of the platform

[00:34:27.800]
itself. So make sure that as you

[00:34:29.800]
look at solutions and governance practices, you

[00:34:32.300]
have a way to make governance a two-way

[00:34:34.300]
street, so that you can get user-driven

[00:34:36.300]
governance as well as highly

[00:34:38.400]
curated governance at the same time. And

[00:34:42.099]
then I'll just end with a quick

[00:34:44.199]
look at Cloudera's platform

[00:34:46.900]
for machine learning and analytics, optimized

[00:34:49.300]
for the cloud. It really takes this idea

[00:34:51.699]
of bringing data together in

[00:34:53.699]
a shared storage layer, opening it

[00:34:55.800]
up to multiple different users and

[00:34:57.800]
types of use cases, be it analytics,

[00:34:59.900]
data science, operational

[00:35:02.599]
use cases, or data engineering,

[00:35:04.599]
and then also ensuring that each of these users

[00:35:08.300]
and workloads not only has access to the same

[00:35:10.599]
shared data but the same shared

[00:35:12.699]
data experience: the same data catalog,

[00:35:14.900]
security policies, and governance, to

[00:35:17.199]
ensure that you can really provide the full breadth

[00:35:19.400]
of data access and insight without

[00:35:21.699]
any risk to the business or

[00:35:24.099]
without any data

[00:35:26.300]
copies or inefficiencies within the platform.

[00:35:28.400]
And

[00:35:30.400]
with that I'll pass it back over to Steven.

[00:35:32.699]
Thank

[00:35:35.000]
you very much, Alex. At this point in time I'd

[00:35:37.199]
like to introduce Steve Wooledge, vice

[00:35:39.400]
president of marketing at Arcadia Data.

[00:35:43.699]
Great, thanks. Thanks for the context, Boris

[00:35:46.199]
and Alex. What I'd like to do in the next 10 minutes (and

[00:35:48.500]
we will have some time for Q&A) is talk about

[00:35:50.599]
the value that you can get from

[00:35:52.599]
the data lake, but

[00:35:54.900]
with the addition of

[00:35:56.900]
a visualization tool

[00:35:59.099]
or a BI platform that can really take

[00:36:01.300]
advantage of it, and not only surface

[00:36:03.400]
that information that is

[00:36:05.500]
part of that 80% that may not

[00:36:07.599]
have been utilized already within the enterprise, but

[00:36:09.699]
also bring it to a much larger audience beyond

[00:36:12.300]
just data scientists.

[00:36:14.400]
I've been in the market for about 18

[00:36:16.599]
years; I've worked for companies like Teradata, Business

[00:36:19.300]
Objects, and other big data companies, and

[00:36:21.800]
what we've seen over the past 10 years is that data

[00:36:24.199]
and platforms have changed, right? We've always

[00:36:26.599]
talked about the velocity and variety

[00:36:28.900]
of data and how that's changed, the need to handle multi-

[00:36:31.199]
structured data and get more access to unstructured

[00:36:33.500]
data, and platforms like Hadoop

[00:36:35.800]
have enabled a lot of that by

[00:36:37.900]
supporting multiple storage engines,

[00:36:40.000]
search, and so forth, enabling

[00:36:42.400]
people to do schema on read

[00:36:44.599]
as well as schema on write: the

[00:36:47.199]
ability to do transformation within

[00:36:49.199]
the platform, or even ELT,

[00:36:51.199]
which is doing discovery before

[00:36:53.500]
you figure out what you want to transform and

[00:36:55.599]
build structure around for analysis.
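Schema on read, as mentioned above, can be illustrated with a small stdlib sketch: raw records land untouched, and each query projects them onto whatever structure it needs at read time. The field names here are made up for the example.

```python
import io
import json

# Raw events land as-is; no table definition required up front.
raw = io.StringIO(
    '{"user": "a", "event": "click", "ms": 120}\n'
    '{"user": "b", "event": "buy", "ms": 340, "sku": "X1"}\n'  # extra field is fine
)

def read_with_schema(lines, fields):
    """Impose the schema the analysis asks for, at read time."""
    for line in lines:
        rec = json.loads(line)
        yield tuple(rec.get(f) for f in fields)  # missing fields become None

rows = list(read_with_schema(raw, ("user", "event")))
print(rows)  # [('a', 'click'), ('b', 'buy')]
```

A different question tomorrow just passes a different `fields` tuple over the same raw data, which is the agility being contrasted with schema-on-write pipelines.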

[00:36:58.000]
So really, a lot of organizations are

[00:37:00.000]
finding they need to have both a data warehouse

[00:37:02.300]
and a data lake to really enable agility.

[00:37:04.599]
With all that's happened, there really hasn't

[00:37:06.800]
been a lot of innovation at the BI layer. A

[00:37:08.900]
lot of the excitement initially has been around data

[00:37:11.000]
science and machine learning, and all the great

[00:37:13.099]
things they do, but there's still an untapped

[00:37:15.500]
need,

[00:37:17.500]
a need that's not being served, to really enable

[00:37:19.599]
more of the front-line users in organizations

[00:37:21.699]
to also get value from this

[00:37:23.699]
data lake. That's really what Arcadia Data

[00:37:25.800]
was designed to do, and what

[00:37:27.800]
we're seeing is that within organizations

[00:37:30.599]
they're creating two BI standards: one

[00:37:32.699]
for the warehouse that is optimized

[00:37:34.800]
for all that relational

[00:37:36.800]
OLAP and the things that happened in the two-tier

[00:37:38.900]
BI way, based on the architectures of

[00:37:41.300]
the time, and then all these scale-out

[00:37:43.400]
architectures.

[00:37:46.099]
People generally refer to these as

[00:37:48.199]
data lakes, and you've got a new opportunity in

[00:37:50.400]
terms of how you can plug BI

[00:37:52.599]
and analytics and visualization into that

[00:37:54.599]
type of architecture and really enable new types

[00:37:56.800]
of use cases. So to double-click

[00:37:58.800]
on that a little bit: if you think about the data

[00:38:00.800]
warehouse, and relational databases in particular,

[00:38:03.300]
the reason why there's a two-tier

[00:38:05.300]
BI architecture is because you could never really install

[00:38:07.699]
the BI software on

[00:38:09.800]
a relational database; it's been heavily

[00:38:12.000]
optimized to work with the

[00:38:14.099]
hardware. I saw that at one of

[00:38:16.099]
my former companies: you get tremendous resource

[00:38:18.400]
utilization because these boxes

[00:38:20.400]
were expensive at the time, and they

[00:38:22.400]
really needed to design the software in a way that

[00:38:24.400]
locked

[00:38:26.599]
in all that resource for the database itself. So

[00:38:28.900]
of course the BI server existed on

[00:38:31.800]
a separate tier, and these scale

[00:38:34.000]
up nicely, but they don't scale

[00:38:36.599]
out quite as well. And more importantly,

[00:38:38.699]
from an analytics process, you've then got to

[00:38:40.699]
optimize your physical

[00:38:42.800]
data storage mechanisms and

[00:38:45.199]
your semantic layer, you're securing

[00:38:47.400]
that data, you're loading it, and you're doing

[00:38:49.500]
it in two separate locations, right:

[00:38:51.800]
once in the data warehouse, defining

[00:38:53.800]
connections, and then doing it again at

[00:38:56.000]
the BI server. And then when you talk about big data,

[00:38:58.000]
you want native connections to

[00:39:00.000]
things like a Solr

[00:39:02.199]
index for semi-structured data, to

[00:39:05.000]
handle parallel processing and

[00:39:07.099]
enable real-time insights. By definition,

[00:39:09.300]
if you're moving data from one

[00:39:11.300]
system to the next, there's latency there, so

[00:39:13.500]
you're missing out on opportunities for

[00:39:15.599]
real-time insights as well on the second system.

[00:39:17.699]
So that really hasn't

[00:39:20.000]
worked, and the reason why an architecture

[00:39:22.199]
that is truly scale-out, like

[00:39:24.199]
Arcadia's, is able to enable

[00:39:26.199]
these things is that Arcadia Data was built from inception

[00:39:28.500]
to run natively within

[00:39:30.500]
data lakes. And what I mean by that is, if

[00:39:32.599]
you think about the open source movement and

[00:39:34.699]
a lot of the openness that's been created

[00:39:37.500]
in the way software

[00:39:39.800]
is built, the ability to plug in

[00:39:41.800]
different processing engines into

[00:39:43.900]
these scale-out storage architectures, we take

[00:39:46.000]
advantage of that: we let our software run

[00:39:48.000]
directly on the data

[00:39:50.099]
nodes where the data exists. And it's

[00:39:52.099]
not just the query processing, like Boris alluded

[00:39:54.400]
to, but it's also all the knowledge about

[00:39:56.599]
how data is stored locally, how

[00:39:58.800]
you can create better-performing

[00:40:01.000]
schemas and analytical models

[00:40:03.099]
to take advantage of that, and scale out in a very,

[00:40:05.300]
very linear way.

[00:40:06.800]
So that's really what we do, and

[00:40:09.099]
because of that you don't have to

[00:40:11.099]
optimize performance in multiple

[00:40:13.300]
locations, you're not moving data between multiple

[00:40:15.400]
tiers, and you don't have to secure it in multiple ways. As

[00:40:17.900]
an example, we can inherit security

[00:40:20.300]
directly from Apache

[00:40:22.599]
Sentry, which is a security

[00:40:24.800]
project for Hadoop, so

[00:40:26.900]
that the administrator doesn't have to configure it in

[00:40:28.900]
a separate place; it just inherits how it's defined

[00:40:31.199]
at the Hadoop level.

[00:40:33.300]
And similarly, we can

[00:40:35.300]
handle things in real time: as data is

[00:40:37.300]
streaming in, it is automatically available.

[00:40:39.300]
Being able to connect to

[00:40:41.300]
modern systems like these in the Hadoop ecosystem allows

[00:40:43.500]
us to take advantage of some of these different

[00:40:45.699]
capabilities that are out there. The

[00:40:48.199]
other thing Arcadia has done, to really flip

[00:40:50.400]
the idea of OLAP on its

[00:40:52.400]
head, is to enable what we call Smart

[00:40:54.500]
Acceleration. One of the challenges that we've

[00:40:56.500]
seen with legacy BI

[00:40:58.500]
and middleware applications is that you

[00:41:00.900]
wind up building cubes in advance

[00:41:03.099]
based on what you think the business

[00:41:05.199]
requirements are, and that can be a

[00:41:07.599]
lot of planning and setup time,

[00:41:09.699]
not to mention you can lock yourself into

[00:41:12.099]
what dimensions and views people want to have.

[00:41:14.099]
In reality, we can

[00:41:16.099]
allow people to query data

[00:41:18.400]
at a granular state and really do discovery

[00:41:20.800]
in a much more agile way, to look for insights

[00:41:23.199]
and what questions should be asked. And

[00:41:25.300]
we enable this through machine learning and artificial

[00:41:27.599]
intelligence: we monitor those

[00:41:29.599]
query patterns, what data is being accessed,

[00:41:31.599]
and we recommend to the administrator smarter

[00:41:33.900]
ways to aggregate,

[00:41:35.900]
store, cache, and physically optimize

[00:41:38.400]
that data in the cluster, stored

[00:41:40.699]
directly back on the same storage

[00:41:42.699]
tier, so the next time the queries

[00:41:44.800]
come in, there's a cost-based optimization

[00:41:47.000]
decision that's made on

[00:41:49.500]
how to speed up performance. This is really the last mile

[00:41:51.900]
of getting value from a data lake, where you want

[00:41:53.900]
to deploy it to hundreds or thousands of users

[00:41:55.900]
in a customer-facing situation,

[00:41:58.699]
and we do it really based,

[00:42:00.900]
again, on the actual usage, with no need

[00:42:03.099]
to build it all in advance. And again, this goes back

[00:42:05.500]
to Boris's point:

[00:42:06.699]
that has a tremendous impact on how

[00:42:08.699]
quickly you can get to insight.
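The usage-driven acceleration just described might be sketched like this: watch which grouping columns queries actually use, then materialize an aggregate for the hottest pattern so later queries hit a small summary instead of the raw rows. The query log, recommendation heuristic, and data below are all invented; Arcadia's actual Smart Acceleration is considerably more sophisticated.

```python
from collections import Counter

# Hypothetical log of which grouping columns past queries used.
query_log = [("region",), ("region",), ("sku",), ("region",)]

def recommend(log, top=1):
    """Suggest the most frequently queried grouping pattern(s)."""
    return [cols for cols, _ in Counter(log).most_common(top)]

def materialize(rows, cols):
    """Pre-aggregate sales by the recommended columns."""
    agg = {}
    for r in rows:
        key = tuple(r[c] for c in cols)
        agg[key] = agg.get(key, 0) + r["sales"]
    return agg

rows = [{"region": "EU", "sku": "X", "sales": 5},
        {"region": "EU", "sku": "Y", "sales": 7},
        {"region": "US", "sku": "X", "sales": 3}]

hot = recommend(query_log)[0]   # ('region',) is the hottest pattern
cache = materialize(rows, hot)  # {('EU',): 12, ('US',): 3}
print(hot, cache)
```

The next `GROUP BY region` query can answer from the two-entry `cache` rather than rescanning every row, which is the effect the cost-based decision is choosing between.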

[00:42:11.900]
As an example of this, one of our large retail

[00:42:14.699]
CPG companies was trying to

[00:42:16.699]
knock

[00:42:18.900]
down the silos across a bunch of different brands,

[00:42:21.099]
from product, marketing, services,

[00:42:24.500]
and shipments, within their organization. They really wanted

[00:42:26.699]
to improve, at what points,

[00:42:28.900]
on a geography basis as well

[00:42:30.900]
as from a digital media perspective, how they

[00:42:32.900]
influence a shopper to purchase the

[00:42:35.199]
different products that they sell. And if we

[00:42:37.199]
run a digital ad, let's say in Europe, how is

[00:42:39.400]
it impacting sales in different countries,

[00:42:41.800]
and different zip codes within those countries,

[00:42:44.000]
et cetera? They

[00:42:47.199]
chose to do this on a data lake architecture

[00:42:49.599]
(it happened to be Cloudera in this case), and

[00:42:51.900]
now they're supporting hundreds of brand managers, giving

[00:42:54.099]
them direct access and self-service visual

[00:42:56.099]
analytics across all the different

[00:42:58.500]
components of these marketing campaigns and

[00:43:00.599]
programs. And what they say is

[00:43:02.599]
that it took them three years to find

[00:43:04.699]
a tool that really allows self-service without

[00:43:07.500]
IT having to go back and pull out more extracts,

[00:43:09.699]
load them into a BI server, and

[00:43:11.900]
then see if that answered the question for

[00:43:13.900]
the business analyst. You

[00:43:16.300]
can of course then drill down. Here's a quick

[00:43:19.699]
snapshot of what something like that would look like; I'll

[00:43:22.000]
give a live demo here in a second. Another

[00:43:24.099]
interesting area for this company, who by the way said

[00:43:27.000]
they identified a billion dollars,

[00:43:29.000]
billion with a B, of

[00:43:31.000]
incremental value from their data

[00:43:33.199]
lake by working with the business to identify

[00:43:35.300]
different areas where they could save expense

[00:43:38.400]
and find new revenue opportunities: one very interesting

[00:43:40.699]
use case was around supply chain optimization. It

[00:43:43.000]
used to be a six-to-eight-

[00:43:45.000]
month project to bring in a consultancy

[00:43:47.099]
to map out all the ship points

[00:43:49.099]
and products and freight rates,

[00:43:51.099]
et cetera, for the different

[00:43:53.099]
routes from manufacturers to wholesalers to

[00:43:55.199]
retailers, and it was a six-month

[00:43:57.500]
process to do this. So if

[00:43:59.500]
you're trying to do what-if analysis, and

[00:44:01.500]
rates are changing, and delivery

[00:44:04.500]
mechanisms potentially change based

[00:44:06.800]
on those rates, then it's hard to

[00:44:08.800]
iterate more quickly. So they implemented a way

[00:44:11.099]
to do that visually, with a

[00:44:13.099]
Sankey diagram and path analysis,

[00:44:15.099]
to figure out those routes to market in a much

[00:44:17.099]
more iterative fashion. It's pretty

[00:44:19.099]
fascinating what can be done when you've got

[00:44:21.199]
all the data and a visual way to explore

[00:44:23.400]
it, and this is being done by business

[00:44:25.599]
analysts, not engineers or

[00:44:27.800]
IT.
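The what-if analysis described above can be reduced to a toy route-cost computation: recompute the cheapest manufacturer-to-wholesaler-to-retailer path whenever freight rates change. The rates and route names below are invented; the point is only that a rate change reruns in seconds instead of restarting a months-long mapping exercise.

```python
def cheapest_path(leg1_rates, leg2_rates):
    """leg1: {(mfr, whs): rate}; leg2: {(whs, rtl): rate}.
    Returns ((mfr, whs, rtl), total_rate) for the cheapest route."""
    best = None
    for (m, w), r1 in leg1_rates.items():
        for (w2, r), r2 in leg2_rates.items():
            if w == w2 and (best is None or r1 + r2 < best[1]):
                best = ((m, w, r), r1 + r2)
    return best

leg1 = {("M1", "W1"): 4, ("M1", "W2"): 6}   # manufacturer -> wholesaler rates
leg2 = {("W1", "R1"): 9, ("W2", "R1"): 5}   # wholesaler -> retailer rates
print(cheapest_path(leg1, leg2))  # (('M1', 'W2', 'R1'), 11)

# A freight-rate change is just a re-run, not a new consulting project:
leg2[("W1", "R1")] = 2
print(cheapest_path(leg1, leg2))  # (('M1', 'W1', 'R1'), 6)
```

A Sankey diagram over the per-route totals is then just a visualization of the same flow data, which is what let the analysts iterate on it themselves.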

[00:44:29.800]
The other part I would say that's really interesting

[00:44:32.199]
is, if you're looking at

[00:44:34.199]
modern data platforms, if

[00:44:36.300]
you're looking at a data lake, and if you treat

[00:44:38.500]
it just like another database, you're

[00:44:40.800]
going to fall into the same trap of

[00:44:44.300]
the data pipelines that have been created, and

[00:44:46.699]
I don't have time to go into this in a whole lot of detail,

[00:44:48.800]
but if you think about treating a data

[00:44:50.800]
lake just like a data warehouse, you're

[00:44:54.099]
going to wind up landing and securing that data, doing

[00:44:56.400]
some physical transformation of it, and

[00:44:58.699]
then, in a separate tier, building your semantic

[00:45:00.699]
layer, doing the performance

[00:45:02.800]
optimization, and moving the data into that

[00:45:04.800]
tier, and then you can start to do analytics

[00:45:07.699]
discovery, right? So that can be days and weeks,

[00:45:09.900]
and, yeah, I've worked at

[00:45:11.900]
organizations where adding

[00:45:14.000]
a new dimension to this model

[00:45:16.099]
could take 6 to 12

[00:45:18.300]
months and a million dollars of cost, and that's

[00:45:20.400]
not a joke. So it's like once more

[00:45:22.400]
around the sun, a year,

[00:45:24.699]
before you can go back and do analytics discovery

[00:45:26.699]
on this new dimension that you wanted to add in

[00:45:28.699]
there. So it just becomes

[00:45:30.699]
a slow process, whereas

[00:45:32.900]
if you take the data lake and you enable

[00:45:35.199]
that agility that Boris talked about in a

[00:45:37.199]
much faster way, you can really

[00:45:39.199]
shrink that time to value down to days.

[00:45:41.199]
So you can query unstructured data;

[00:45:43.500]
you're doing discovery before

[00:45:45.900]
you do performance modeling; you model

[00:45:48.099]
it after the fact, once you've figured out what needs to

[00:45:50.099]
be modeled and optimized. And that's really the approach

[00:45:52.400]
we've taken with how we've implemented our BI

[00:45:55.000]
software within the data lake.

[00:45:57.400]
Again, it's one security model, no movement of

[00:45:59.400]
data, etc.,

[00:46:02.599]
and you take that analytics discovery process

[00:46:04.599]
from step 6 down to step 3. I

[00:46:06.599]
think you kind of get that point. And we've got a lot of

[00:46:08.599]
customers that we work with; a

[00:46:11.099]
lot of these are with hot air in a lot of different application

[00:46:13.599]
areas obviously customer intelligence and

[00:46:15.900]
inside as one big area Financial Services

[00:46:18.099]
as big telecommunications iot

[00:46:20.500]
analytics is really interesting but the

[00:46:22.500]
example I wanted to give was around cybersecurity

[00:46:25.400]
so with that I'm going to try and

[00:46:27.599]
share my screen here

[00:46:32.400]
and pull up a live application so

[00:46:35.699]
this you know don't try this at home but

[00:46:37.699]
we'll do a live demo here this is

[00:46:39.800]
Arcadia data running on a server

[00:46:41.800]
in our in our office we built

[00:46:43.800]
a demo with Cloud era around a

[00:46:46.099]
project called Apache spot and

[00:46:48.599]
what a patchy spot does it's the community-driven

[00:46:50.599]
approach and open

[00:46:52.599]
source project to fighting cyber

[00:46:54.699]
security threats and it provides

[00:46:57.199]
a open data model or

[00:46:59.699]
way to store data about all the different threat

[00:47:03.000]
intelligence and points etc

[00:47:05.000]
for your organization as

[00:47:07.199]
well as machine learning algorithms to help

[00:47:09.199]
identify suspicious

[00:47:11.199]
activity so I can just providing the front

[00:47:13.199]
end to this and if you look at what we felt

[00:47:15.199]
this is an executive summary of you and

[00:47:17.300]
then I want to come to go to all this in detail but this

[00:47:19.800]
is taking the machine learning

[00:47:21.800]
outfits and visualizing the top threats

[00:47:24.199]
across users and

[00:47:26.400]
points and that works and is it security

[00:47:28.599]
analyst being able to

[00:47:30.599]
see this all in a single-pane-of-glass and

[00:47:32.699]
do your forensic

[00:47:34.800]
analysis to record within one system is

[00:47:36.800]
huge because typically your your swivel

[00:47:39.000]
chair analytics moved from one system to another trying

[00:47:41.199]
to look at threat intelligence in the bluecoat proxy things

[00:47:43.699]
and then you know your active directory

[00:47:45.900]
and now you can look directly at all Network

[00:47:48.500]
traffic in the organization you

[00:47:50.699]
can get a timeline view of what's happening over

[00:47:52.800]
time we've got a network graph

[00:47:54.800]
to look in bubble up a couple

[00:47:56.900]
dozen suspicious activities

[00:47:59.400]
and we might want to drill into an account

[00:48:01.500]
on the bottom left for using machine learning to

[00:48:03.500]
Bubble Up specific threats

[00:48:05.500]
that have been identified by Source IP

[00:48:07.800]
but this is interactive I can go

[00:48:09.800]
ahead and grab a Time slider and focus

[00:48:12.300]
down my right

[00:48:14.800]
now said I want to look at potentially if

[00:48:17.000]
I wanted to look at this specific

[00:48:19.199]
IP address I can just click on that it's

[00:48:21.900]
going to pull up more information about that IP

[00:48:24.099]
address and we can look up and that Warcraft all

[00:48:26.300]
the different endpoints that this this IP

[00:48:29.099]
address is connected to an end drill into detail

[00:48:31.400]
by going down here and

[00:48:33.599]
again because it's a security data

[00:48:35.599]
like if you were we got all the data from the organization

[00:48:37.900]
in one place not just 20% a hundred

[00:48:40.000]
percent and they're bringing and data from

[00:48:42.300]
outside sources potentially as well. So, yeah,

[00:48:44.300]
not a lot of time, but you can look at different

[00:48:49.699]
domains here, in this case Russia,

[00:48:52.000]
that have bubbled up as suspicious. You can go

[00:48:54.099]
down and look at the creation dates

[00:48:56.199]
for these different things that were

[00:48:58.300]
happening, all the details right there. So I

[00:49:00.500]
can drill in and get a lot of detail on what different

[00:49:02.599]
users they're connected to, etc.,

[00:49:04.699]
etc. So a very quick

[00:49:07.000]
flyby and will provide links to other videos

[00:49:09.099]
and demos after the fact, but

[00:49:11.099]
I just wanted to share what this can

[00:49:13.099]
look like from an end-user perspective and again

[00:49:15.099]
this is not a data science

[00:49:17.300]
workbench; we're taking the

[00:49:19.800]
great work that data scientists have done, and we just

[00:49:21.800]
bring it into an intuitive interface

[00:49:24.699]
to go and do threat analysis,

[00:49:26.699]
again from a cyber-security perspective
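[Editor's note: the interactive drill-down Steve walks through, narrowing by time slider and then pivoting to everything a single IP touched, boils down to filter-and-group operations over flow records. A minimal sketch in plain Python; the event fields and sample values are hypothetical, not Arcadia Data's actual data model.]

```python
from datetime import datetime

# Hypothetical flow records; a real security data lake holds billions of these.
events = [
    {"ts": datetime(2018, 4, 1, 9, 0),  "src_ip": "10.0.0.5", "dst": "login.example.ru"},
    {"ts": datetime(2018, 4, 1, 9, 5),  "src_ip": "10.0.0.5", "dst": "files.example.ru"},
    {"ts": datetime(2018, 4, 1, 22, 0), "src_ip": "10.0.0.7", "dst": "cdn.example.com"},
]

def time_slider(events, start, end):
    # Narrow the working set to a window, like dragging the time slider.
    return [e for e in events if start <= e["ts"] < end]

def drill_into_ip(events, ip):
    # Every endpoint a single source IP connected to within the window.
    return sorted({e["dst"] for e in events if e["src_ip"] == ip})

window = time_slider(events, datetime(2018, 4, 1, 8, 0), datetime(2018, 4, 1, 10, 0))
print(drill_into_ip(window, "10.0.0.5"))
```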

[00:49:30.699]
so with that, I'll turn it back over. I

[00:49:32.900]
think we were going to put

[00:49:35.500]
some slides up and can wrap things up for questions.

[00:49:37.599]
I was just going to add one

[00:49:39.699]
last thing here on this slide: there

[00:49:41.900]
is some research that Boris has done around

[00:49:44.000]
what at the time was called native

[00:49:46.000]
Hadoop BI, so this gets into more specifics

[00:49:48.500]
about these distributed BI architectures,

[00:49:51.199]
the differences over traditional BI. We've got

[00:49:53.199]
some other demos we've built; this

[00:49:55.900]
one's around connected-car fleet

[00:49:57.900]
management. If you want to get started, Arcadia Instant

[00:49:59.900]
is a completely free download you can get installed

[00:50:02.599]
on your desktop, with sample data sets,

[00:50:04.800]
and you can start to explore on

[00:50:07.800]
your own time. Thanks for joining us today; now we'd love to

[00:50:09.800]
take some questions.

[00:50:12.000]
thank you very much Steve we're going

[00:50:14.099]
to move into questions from our viewers

[00:50:16.099]
today and our first question is for Boris

[00:50:18.599]
Boris, I already have other BI

[00:50:20.800]
tools; why do I need to consider Arcadia

[00:50:23.500]
Data? So

[00:50:25.599]
I think it's part of exactly

[00:50:27.599]
the same conversation we had

[00:50:29.599]
earlier, which is that if you are

[00:50:31.699]
okay with looking at just the

[00:50:33.699]
20, 30, 40 percent of your data,

[00:50:35.699]
and if you're performing

[00:50:37.699]
structured data analysis, in

[00:50:39.800]
other words you are operating in

[00:50:42.400]
a schema-on-write environment

[00:50:44.500]
where everything is already

[00:50:46.500]
predetermined, fixed, etcetera, then

[00:50:49.300]
you may not be experiencing

[00:50:52.099]
issues with wide

[00:50:54.099]
area network access or

[00:50:56.599]
data

[00:50:58.800]
replication; there are definitely

[00:51:00.800]
plenty of environments

[00:51:03.000]
where that is really all you need. But

[00:51:05.099]
I think once you start getting into

[00:51:07.699]
terabytes of data, once you want

[00:51:09.699]
to start getting into multiple

[00:51:11.800]
data types, and much, much more importantly, once

[00:51:14.400]
you start getting into environments

[00:51:16.500]
where you can't really wait

[00:51:19.000]
even for a few weeks for

[00:51:21.199]
your relational database

[00:51:24.099]
administrator to

[00:51:26.300]
change a value in a

[00:51:28.300]
column, to create a new join

[00:51:30.800]
with a primary/foreign

[00:51:32.900]
key; if your environment calls

[00:51:35.099]
for a more agile,

[00:51:37.199]
much more responsive type

[00:51:39.900]
of environment where you address your requirements

[00:51:42.500]
within hours as opposed to days

[00:51:45.099]
or weeks, so I think that's when you really

[00:51:48.300]
start to need to look

[00:51:50.300]
elsewhere and

[00:51:52.800]
therefore what we really are seeing

[00:51:54.900]
today is that probably no one

[00:51:57.000]
out there in any sizable

[00:52:00.500]
enterprise organization

[00:52:02.800]
actually has a single BI tool

[00:52:11.900]
and lives happily ever after without

[00:52:14.099]
multiple BI technologies.

[00:52:17.000]
yeah I mean one

[00:52:19.400]
point that I would add is some of the large

[00:52:21.500]
organizations we work with are

[00:52:23.500]
choosing multiple standards: one standard

[00:52:25.500]
for the data lake and one for the data warehouse. It's

[00:52:27.800]
not that they want to

[00:52:29.800]
replace all the good work they've done on the data warehouse,

[00:52:32.000]
but they want to open up some of these different mechanisms

[00:52:34.300]
on top of modern data platforms

[00:52:36.599]
like Cloudera, so that's

[00:52:39.400]
what we're starting to see: these kinds

[00:52:41.400]
of new tools and standards emerging.
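[Editor's note: Boris's schema-on-write versus schema-on-read distinction can be made concrete with a toy sketch in plain Python, not any vendor's API. In a schema-on-read data lake, structure is applied at query time, so a new field needs no DBA change or table migration.]

```python
import json

# Raw events land as-is in the lake -- no upfront schema (schema-on-read).
raw = [
    '{"user": "ana", "amount": 12.5}',
    '{"user": "bo", "amount": 3.0, "channel": "mobile"}',  # new field, no migration
]

def read_with_schema(lines, fields):
    # Apply the schema at read time; fields absent in a record become None.
    return [{f: json.loads(line).get(f) for f in fields} for line in lines]

# Yesterday's report used two fields; today's adds "channel" -- no DBA needed,
# no waiting weeks for a table rebuild.
rows = read_with_schema(raw, ["user", "amount", "channel"])
print(rows)
```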

[00:52:45.699]
Understood. Our next

[00:52:47.800]
question is for Alex. Alex, how do

[00:52:49.900]
you balance the needs for data governance as you

[00:52:51.900]
shift to more Self Service access

[00:52:54.300]
and Analytics

[00:52:56.400]
Yeah, happy to speak to that. So like

[00:52:58.599]
I mentioned one of the things that I think around

[00:53:00.599]
data governance is

[00:53:03.500]
being able to have it become this

[00:53:05.699]
two-way street when you start moving into

[00:53:08.099]
a modern platform where

[00:53:10.099]
you have more data coming in at

[00:53:12.300]
different speeds as well, so you may have data

[00:53:14.400]
landing in real time, you may have data coming in

[00:53:16.500]
in batch updates, and

[00:53:18.800]
all of that data may not have

[00:53:21.099]
initially a known value

[00:53:23.199]
or use case right out of the gate,

[00:53:25.400]
but some of that data, as we spoke

[00:53:27.500]
to before, is supporting some

[00:53:29.500]
of your production needs and will definitely

[00:53:31.599]
kind of be supporting a known use case

[00:53:33.800]
going into it and

[00:53:36.400]
so being able to have

[00:53:38.400]
this almost decentralized

[00:53:40.900]
model,

[00:53:45.800]
being able to have a data

[00:53:48.000]
platform as well as the process

[00:53:50.199]
and the people to be able to

[00:53:52.400]
steward and treat the data

[00:53:54.500]
for sets of known use cases, for

[00:53:56.800]
sets of reports, maybe

[00:53:59.000]
regular dashboards that are going out to Executive

[00:54:01.500]
teams, being able to trust that

[00:54:03.699]
the data being used there is accurate

[00:54:05.900]
and then being

[00:54:07.900]
able to have the data also be

[00:54:10.300]
open for immediate access for

[00:54:12.300]
the users to not necessarily having to go

[00:54:14.300]
through that very linear curation

[00:54:17.000]
aspect. One of the benefits

[00:54:19.099]
of this approach is

[00:54:21.699]
that governance policies and

[00:54:24.000]
metadata management can be added

[00:54:26.199]
as you go so as you see

[00:54:28.199]
data being used more often by

[00:54:30.199]
different teams, as you see regular tables

[00:54:32.599]
or columns being accessed

[00:54:34.800]
and being put into new dashboards,

[00:54:37.099]
you can actually add governance policies

[00:54:39.599]
as you move to the platform, and

[00:54:43.199]
even open this up to have your

[00:54:45.199]
end users participate in this:

[00:54:47.500]
collaborative governance, as we call

[00:54:49.500]
it, so end users can add their

[00:54:51.500]
own discoverability, their own

[00:54:53.599]
tagging, their own stewardship to data

[00:54:56.300]
so they can better work across teams,

[00:54:58.500]
and one of the big

[00:55:00.599]
areas is to really break free

[00:55:02.699]
from a lot of this linear mindset

[00:55:04.800]
that really works for kind of a set number

[00:55:06.800]
of use cases but

[00:55:09.199]
you don't want that to be a limiting factor as you

[00:55:11.199]
start to open up broader access, so

[00:55:14.099]
having a platform and a metadata management

[00:55:16.400]
plan that can address all

[00:55:18.500]
of those is key. Understood,

[00:55:21.900]
thanks, Alex. Steve, our

[00:55:23.900]
next question is for you: are

[00:55:25.900]
artificial intelligence and machine learning

[00:55:27.900]
leveraged in big data analytics? How

[00:55:30.199]
can BI tools leverage this?

[00:55:34.699]
Yeah, that's a really neat one. Part

[00:55:37.099]
of what I showed in the demonstration was how you

[00:55:39.300]
can visualize the output from machine

[00:55:41.800]
learning and artificial intelligence

[00:55:44.400]
like within the Apache Spot project.

[00:55:46.900]
The other thing that doesn't get a lot of

[00:55:48.900]
press is that a lot of technology,

[00:55:51.199]
like Arcadia Data, is using machine

[00:55:53.199]
learning within the product itself so

[00:55:55.500]
I talked about the smart acceleration technology which

[00:55:57.699]
uses machine learning to analyze

[00:56:00.300]
and recommend different ways to speed

[00:56:03.300]
up the queries of the data on an ongoing basis

[00:56:05.699]
One thing I didn't get a chance to show is

[00:56:08.000]
we also use that intelligence

[00:56:10.800]
to make it easier for the end-users

[00:56:13.300]
we can actually do what we call instant visuals,

[00:56:15.599]
so you can look at a data set, and based on the

[00:56:17.599]
dimensions that you selected, our

[00:56:20.300]
system will actually recommend the

[00:56:22.599]
best visualization technique and you'll see a

[00:56:24.599]
palette of like six or nine different visual

[00:56:27.699]
types displayed on the screen with

[00:56:29.699]
your actual data, and these are based

[00:56:31.699]
on rules and learning from

[00:56:34.300]
how people are

[00:56:36.400]
selecting visuals and what the best visuals are

[00:56:38.500]
to use. So we're absolutely trying to accelerate

[00:56:40.699]
that and make it easier for people to analyze

[00:56:43.699]
these big data sets
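[Editor's note: the "instant visuals" idea Steve describes, recommending chart types from the fields a user selects, can be sketched with a few simple rules. The rules below are illustrative only; Arcadia Data's actual recommender combines rules with learned user behavior.]

```python
def recommend_visuals(dimensions, measures):
    # Toy rules mapping a field selection to candidate chart types.
    recs = []
    if "time" in dimensions:
        recs.append("line chart")      # trends over a time dimension
    if len(dimensions) == 1 and measures:
        recs.append("bar chart")       # one category against a measure
    if len(measures) >= 2:
        recs.append("scatter plot")    # measure vs. measure
    return recs or ["table"]           # safe fallback when no rule fires

print(recommend_visuals(["time", "region"], ["revenue"]))
```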

[00:56:46.400]
Understood, thanks, Steve. Boris,

[00:56:48.699]
back to you: how do I sell a

[00:56:50.699]
BI project/platform

[00:56:52.800]
to the business sponsors I

[00:56:58.599]
guess the probably overly simplistic

[00:57:00.900]
answer is you should get another job, because

[00:57:03.000]
if you are working for an

[00:57:05.300]
organization where business executives

[00:57:07.699]
aren't seeing the value of this,

[00:57:09.800]
then there are plenty of others

[00:57:12.099]
where they do. But kidding

[00:57:14.300]
aside, and this obviously

[00:57:16.599]
applies across different types

[00:57:18.800]
of technologies, not just about

[00:57:21.199]
this particular type of technology, but

[00:57:23.400]
I am a huge fan of

[00:57:26.099]
rapid proof-of-concept

[00:57:28.099]
processes. Steve talked

[00:57:30.300]
about the free download

[00:57:34.699]
you can get and immediately

[00:57:36.800]
put into action, so that's precisely

[00:57:39.000]
what I would recommend: take

[00:57:41.699]
a day

[00:57:43.800]
or a couple of days to build, and

[00:57:46.500]
see if your proof

[00:57:48.500]
of concept can provide an indication

[00:57:51.000]
of what tangible value,

[00:57:53.300]
or what tangible

[00:57:55.599]
outcomes, you're going to

[00:57:57.699]
support. Such as, for example, if

[00:57:59.699]
in customer care, let's say you are struggling

[00:58:02.099]
with customer churn and you don't understand the root

[00:58:04.099]
cause. You do a

[00:58:10.599]
bit of a root cause analysis

[00:58:12.599]
and you find the root cause; you could potentially

[00:58:14.599]
predict that X

[00:58:16.599]
percent of your clients

[00:58:18.599]
are going to stay with you or come back to you and

[00:58:20.900]
that's a gold mine

[00:58:23.000]
of an opportunity to again take it

[00:58:25.000]
to your business executives and say, okay,

[00:58:27.599]
can you fund this project

[00:58:30.199]
to scale this out?

[00:58:32.400]
Understood. That's actually

[00:58:34.500]
all the time we have for questions today;

[00:58:36.500]
we apologize that we weren't able to get to all

[00:58:38.500]
your questions but as I stated earlier all

[00:58:40.699]
questions will be answered via email

[00:58:42.800]
I'd like to thank our speakers

[00:58:44.900]
today: Boris Evelson, vice president and principal

[00:58:47.099]
analyst at Forrester;

[00:58:49.199]
Alex Gutow, senior product

[00:58:51.199]
marketing manager at Cloudera; and

[00:58:53.800]
Steve Wooledge, vice president of marketing

[00:58:55.800]
at Arcadia Data. If

[00:58:58.000]
you would like to review this presentation or send it to

[00:59:00.000]
a colleague you can use the same URL

[00:59:02.099]
that you used for today's live event;

[00:59:04.099]
it will be archived and you'll

[00:59:06.199]
receive an email tomorrow once the archive is posted

[00:59:08.900]
now as we stated

[00:59:11.000]
earlier, just for participating in today's

[00:59:13.099]
event someone will win a $100

[00:59:15.599]
American Express gift card and

[00:59:17.800]
the winner today is Chris Webber

[00:59:19.900]
Chris we will be in touch via email

[00:59:22.000]
so you can claim your prize thank

[00:59:24.599]
you everyone for joining us today and we hope to see you

[00:59:26.599]
again soon