
Streamlining Analytics Reporting on Big Data at Tableau Conference 2019

Author
Saswata Sengupta
Senior Solutions Architect, Kyligence
Dec. 02, 2019

Tableau Conference 2019 was a great event for organizations looking to get more insights out of their data. This year’s conference had an amazing turnout of over 19,000 attendees and brought together executives, business intelligence (BI) and data warehouse managers, and data analysts from across every industry.

The conference was also a good opportunity to take a step back and see how the whole data and analytics ecosystem has evolved: which new companies have appeared and what vendors are focused on this year.

Tableau Conference Was a Great Opportunity to Exchange New Ideas

You have companies like Alteryx helping teams build AI and machine learning models through a visual development studio, Dremio virtualizing data for enterprise access, and InterWorks and HCL delivering top-notch consulting for BI implementations. There was no shortage of new and exciting ideas, but it was also very clear that some of the same challenges attendees were facing last year still hadn't been solved.

What Folks Were Talking About at Tableau Conference 2019

Having spent time speaking with both conference-goers and members of the Tableau team, it was quite evident to me that Tableau, as a BI and visualization tool, has emerged as the clear winner in the minds of many in the Data and Analytics industry.

Why? Because when it comes to self-service reporting and dashboard solutions, there’s nothing else out there that is as easy to use for both analysts and the leadership they provide insights to.

There Was Plenty of Fun to Be Had in the Tableau Data Pool

In most cases, enterprises enable their Tableau users and developers by implementing data marts or an intermediate data storage layer, but many have yet to figure out how to effectively leverage Tableau over their Big Data or Data Lake investments, whether on-premises or in the cloud.

This, of course, creates plenty of additional overhead in the ETL process. Business users have to wait to access the data, and there is no way to get business insights in real time.

Sharing the Benefits of Kyligence with the Tableau Community

It doesn’t need to be this way, however. One of the best parts about my job is being able to connect with folks in the industry and help them find ways of improving the great work they’re already doing. Tableau Conference was no exception, and I enjoyed having an opportunity to share how Kyligence is uniquely positioned to solve the problems I’m talking about.

Because Kyligence operates natively in Big Data environments, it reduces the need to invest in additional servers or other costly resources. Additionally, our direct connector to Tableau helps analysts visualize their Big Data without having to move it out of their data lake. Finally, being available on major cloud platforms like Microsoft Azure and Amazon Web Services (AWS) means enterprises can enjoy all of the benefits that come with using Azure Data Lake (ADL) or S3.
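For readers curious what querying data through Kyligence looks like outside of the Tableau connector, here is a minimal sketch in Python. It assumes a Kylin-compatible REST SQL endpoint (Kyligence Enterprise is built on Apache Kylin), and the host, port, project name, table, and credentials are all placeholders; your actual deployment's endpoint and authentication may differ.

```python
# Minimal sketch: running a SQL query against a Kylin-compatible REST endpoint.
# Host, port, project, credentials, and table/column names are placeholders.
import requests

KYLIN_HOST = "http://kyligence.example.com:7070"   # hypothetical host and port
PROJECT = "sales_analytics"                        # hypothetical project name

payload = {
    "sql": "SELECT region, SUM(sales_amount) AS total_sales "
           "FROM sales_fact GROUP BY region",      # hypothetical table/columns
    "project": PROJECT,
    "limit": 100,
}

# Apache Kylin exposes its SQL query API at /kylin/api/query with HTTP basic
# auth; a Kyligence deployment may use a different path or auth scheme.
resp = requests.post(
    f"{KYLIN_HOST}/kylin/api/query",
    json=payload,
    auth=("ANALYST_USER", "ANALYST_PASSWORD"),     # placeholder credentials
    timeout=60,
)
resp.raise_for_status()

result = resp.json()
columns = [c["label"] for c in result.get("columnMetas", [])]
for row in result.get("results", []):
    print(dict(zip(columns, row)))
```

In practice, Tableau's live connection issues queries like this for you through the connector; the point is simply that the aggregation runs inside the Big Data platform rather than after an extract.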

The Kyligence Team at Tableau Conference 2019

Combining Kyligence with your data lake and Tableau solution is a clear winner when it comes to supercharging your visualization and reporting on Big Data. If you’re like most of the folks I spoke with, you’re probably a bit more curious about Kyligence and would like to learn more.

If you’d like additional information, I’d recommend visiting our product pages for Kyligence Enterprise and Kyligence Cloud. And if you want a really great place to start, I’d urge you to download a copy of our solution overview guide or schedule a 1:1 demo with me or another awesome member of the Kyligence team.

Download Kyligence Solution Overview Guide

Schedule a 1:1 Demo