Browse and analyze Apache Kafka® topics with Arcadia Data
- Read our recent announcement of Arcadia Instant for KSQL and get started with visualizations on Kafka via free downloads.
- Read about our recent product release with Kafka support.
- Read our recent blog on Arcadia Instant for KSQL.
- Read our blog on our latest release.
- Learn more about streaming analytics in data lakes in our webinar with 451 Research, Accelerating Data Lakes and Streams with Real-time Analytics.
Visualize Kafka topics with Arcadia Instant
Take these four simple steps:
- Read the Getting Started Guide.
- Download and install Arcadia Instant.
- Get the KSQL/Kafka Docker image per the Getting Started Guide.
- Follow this guide to build a working dashboard on Apache Kafka.
SEE ARCADIA IN ACTION
Streaming Data using KSQL
Arcadia Data and KSQL IoT Demo
Connected Vehicle Demo
Streaming Data Visualizations with Arcadia Enterprise
Streaming data environments built around the Internet of Things, change data capture, and other time-series sources keep growing in popularity. A key technology catalyst for these environments is Apache Kafka, a platform for managing ordered event data in a publish-subscribe model. And while Kafka has been vital for the data management side of streaming, challenges remain around how business analysts can easily derive real-time insights.
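Kafka's model can be pictured as an append-only, ordered event log that producers publish to and consumers subscribe to. A minimal in-memory sketch of that publish-subscribe idea (illustrative only, not the Kafka client API) might look like:

```python
class Topic:
    """A minimal stand-in for a Kafka topic: an append-only, ordered event log."""

    def __init__(self, name):
        self.name = name
        self.log = []            # events, kept in publish order (offset = index)
        self.subscribers = []    # callbacks invoked as each new event arrives

    def publish(self, event):
        self.log.append(event)
        for callback in self.subscribers:
            callback(event)

    def subscribe(self, callback):
        self.subscribers.append(callback)


# Example: a consumer sees events in the exact order they were published.
readings = Topic("sensor-readings")
seen = []
readings.subscribe(seen.append)
readings.publish({"sensor": "s1", "temp": 71})
readings.publish({"sensor": "s2", "temp": 68})
```

The ordered log is the key property: every subscriber observes the same events in the same sequence, which is what makes downstream stream processing reproducible.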
Watch an integration of Arcadia Enterprise and KSQL in action.
The traditional way
A typical configuration for real-time environments today involves an intermediary store. Streaming data is staged in a data store such as Apache Solr™, Apache HBase™, or Apache Cassandra™, and queried from there.
Challenges with the traditional way include:
- Complicated to set up, especially if the intermediary store is separate from the system-of-record store
- Staging inhibits real-time data access
- Polling the store limits scalability across many clients
- Requires data modeling for the staging store
- No ability to ask dynamic questions of the stream
- Not self-service since significant IT work is required along the way
- Non-real-time visuals, as traditional BI tools require manual refreshes to redraw the screen
- Heavy dependence on IT, as many streaming analytics frameworks require coding
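The polling pattern behind several of these challenges can be sketched as follows (an illustrative stand-in for a staging store, not any specific product):

```python
class StagingStore:
    """Stand-in for an intermediary store holding staged stream data."""

    def __init__(self):
        self.rows = []

    def write(self, row):
        self.rows.append(row)

    def query_since(self, offset):
        # Clients must ask "what's new since my last poll?"
        return self.rows[offset:]


store = StagingStore()
store.write({"sensor": "s1", "temp": 71})

# Each dashboard client polls on an interval: data is only as fresh as the
# last poll, and every additional client adds query load on the store.
offset = 0
new_rows = store.query_since(offset)
offset += len(new_rows)
```

Between polls the dashboard is stale, and scaling to many clients multiplies the query load, which is exactly the latency and scalability cost listed above.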
Arcadia Enterprise and Confluent's KSQL address these issues:
The integration of Arcadia Enterprise with KSQL opens up streaming data to a large user base of business analysts. KSQL leverages SQL as an interface to Kafka streams/topics, so queries can be run directly on the event data without an intermediary store.
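In KSQL, a continuous query such as `SELECT * FROM readings WHERE temp > 80;` keeps emitting matching rows as new events arrive on the stream. A minimal Python sketch of that continuous-filter idea over an in-memory stream (illustrative only, not the KSQL engine or client) might be:

```python
def continuous_filter(predicate, sink):
    """Return a handler that forwards matching events to sink,
    emulating a continuous WHERE clause over a stream."""
    def handle(event):
        if predicate(event):
            sink.append(event)
    return handle


# Roughly the shape of: SELECT * FROM readings WHERE temp > 80;
alerts = []
handler = continuous_filter(lambda e: e["temp"] > 80, alerts)

# As events arrive on the stream, matches are emitted immediately --
# no staging store to load, no polling loop to schedule.
for event in [{"sensor": "s1", "temp": 71},
              {"sensor": "s2", "temp": 95},
              {"sensor": "s1", "temp": 88}]:
    handler(event)
```

The query runs against the events themselves as they flow through, which is why no intermediary store or data modeling step is needed before analysts can ask questions.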
As a native visual analytics platform, Arcadia Enterprise provides the visualizations on streaming data to support workflows such as:
Investigate alerts in real time
- A real-time machine learning or alerting system notices a situation and issues an alert or incident for your subject matter expert to investigate.
- The user may then want a real-time dashboard of what is happening, e.g., in cybersecurity or healthcare monitoring.
Pivot from historic forensic analysis into real-time in the same application
- A user exploring deep historic information with traditional OLAP techniques finds something interesting.
- They then pivot into a real-time view of the data to test their theory, e.g., a misbehaving device, an underperforming marketing campaign, or fraud at an ATM.
Example use cases include:
Equipment monitoring
Visually identify trends and patterns that reveal impending failures in equipment.
Financial services risk reporting
Get immediate understanding of capital market risk to make quick decisions about asset allocation and continuously ensure compliance.
Cybersecurity
Quickly detect and alert on anomalous behavior in network log data that may represent a breach.
Connected vehicles
Monitor vehicle usage to understand route efficiency, driver efficiency, and traffic patterns.
Network device performance
Detect failures and respond to hot spots in communication networks to optimize performance and improve customer satisfaction.
Data quality checks
Identify data quality issues upstream in the pipeline to avoid downstream errors that lead to bad or delayed decisions.