How Spark is Enabling the New Wave of Converged Applications
Apache Spark has become the de facto compute engine of choice for data engineers, developers, and data scientists because of its ability to run multiple analytic workloads on a single compute engine. Spark is speeding up data pipeline development, enabling richer predictive analytics, and bringing a new class of applications to market.
However: Is Spark alone sufficient for developing converged applications? How can you speed up the development of applications that span Spark and other frameworks such as Kafka, NoSQL databases, and more? Join us to get answers to these questions. You will also:
- Learn about critical requirements to enable enterprise-grade Spark applications
- Hear about examples of leading-edge Spark applications and their architectures in industries like manufacturing, telecommunications, and high tech
- See a demo of Spark Streaming with MapR Streams
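To give a flavor of the kind of integration the demo covers, here is a minimal sketch of a Spark Streaming job consuming from MapR Streams, which exposes the Kafka consumer API. The stream path and topic name (`/sample-stream:sensor-readings`) and the consumer group id are hypothetical placeholders, and the exact dependencies (Spark's Kafka 0.10 integration plus the MapR client libraries) would depend on the cluster setup.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object MapRStreamsDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MapRStreamsDemo")
    // Micro-batch interval of 5 seconds
    val ssc = new StreamingContext(conf, Seconds(5))

    // MapR Streams topics are path-based: "/stream-path:topic-name".
    // This stream and topic are hypothetical examples.
    val topics = Array("/sample-stream:sensor-readings")
    val kafkaParams = Map[String, Object](
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "spark-demo-consumer"
    )

    // Create a direct stream using the Kafka consumer API,
    // which MapR Streams implements
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))

    // Simple per-batch processing: count messages received
    stream.map(_.value).count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This is only a sketch of the pattern; running it requires a Spark cluster with the MapR client installed, so it is not a standalone program.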