Best of Real-Time Analytics 2025

  1. Article
    Foojay.io · 1y

    Event-Driven Architecture and Change Data Capture Made Easy

    Event-Driven Architecture (EDA) and Change Data Capture (CDC) are key techniques in modern software systems. EDA relies on components producing and consuming events to trigger actions, making systems flexible and scalable. CDC tracks database changes and converts them into events for other systems. EDA is used for decoupling services and real-time communication, while CDC synchronizes data and powers analytics. They can also be combined, pairing decoupled workflows with real-time data tracking. Understanding when to use each helps build efficient, maintainable systems.
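    The CDC-to-event handoff described above can be sketched in a few lines. This is a minimal, hypothetical sketch assuming a Debezium-style change record (`op`, `before`, `after`, `source`); real payloads and event names will differ:

    ```python
    # Hypothetical mapping from a Debezium-style CDC record to a domain event.
    OP_TO_EVENT = {"c": "OrderCreated", "u": "OrderUpdated", "d": "OrderDeleted"}

    def cdc_to_event(change: dict) -> dict:
        payload = change["payload"]
        return {
            "type": OP_TO_EVENT[payload["op"]],
            # deletes carry only a before-image; creates/updates carry an after-image
            "data": payload.get("after") or payload.get("before"),
            "source_table": payload["source"]["table"],
        }

    change = {"payload": {"op": "c", "before": None,
                          "after": {"id": 1, "total": 42.0},
                          "source": {"table": "orders"}}}
    print(cdc_to_event(change)["type"])  # -> OrderCreated
    ```

    The point of the translation layer is that downstream consumers see domain events, not raw table rows, which keeps them decoupled from the source schema.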

  2. Article
    Tinybird · 52w

    Using LLMs to generate user-defined real-time data visualizations

    Developers are increasingly using Tinybird to track LLM usage, costs, and performance in AI applications. A new app template called the LLM Performance Tracker allows users to generate real-time data visualizations. The core components include a Tinybird datasource, a Tinybird pipe, a React component, and an AI API route. The backend processes user input to generate chart parameters, while the frontend visualizes the data. This approach emphasizes the importance of performant analytics backends and cautious LLM usage for secure and scalable data visualization.
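    The "cautious LLM usage" point is worth making concrete: chart parameters produced by an LLM should be validated against an allow-list before they touch a query. A minimal sketch (the metric and chart names here are illustrative, not Tinybird's):

    ```python
    # Allow-lists for LLM-generated chart parameters (names are illustrative).
    ALLOWED_METRICS = {"total_cost", "token_count", "latency_ms"}
    ALLOWED_CHART_TYPES = {"line", "bar", "area"}

    def validate_chart_params(params: dict) -> dict:
        """Reject anything the LLM produced that we did not explicitly allow."""
        metric = params.get("metric")
        chart_type = params.get("chart_type", "line")
        if metric not in ALLOWED_METRICS:
            raise ValueError(f"metric not allowed: {metric!r}")
        if chart_type not in ALLOWED_CHART_TYPES:
            raise ValueError(f"chart type not allowed: {chart_type!r}")
        return {"metric": metric, "chart_type": chart_type}

    print(validate_chart_params({"metric": "latency_ms", "chart_type": "bar"}))
    ```

    The design point is that the LLM only ever selects from a fixed menu; the query itself is built server-side from the validated values, so model output never becomes SQL.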

  3. Article
    CrateDB · 17w

    Distributed Search Engines and Real Time Analytics at Scale

    Distributed search engines partition data across multiple nodes to handle massive datasets with low latency, but struggle with complex aggregations, analytical queries, and joins. Modern workloads increasingly require both search and real-time analytics capabilities in a single platform. The article explores how distributed search architectures work, their limitations, and the convergence toward unified analytics databases that treat search as one capability among many, rather than a standalone engine requiring separate infrastructure.
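    The scatter-gather pattern behind distributed aggregations can be shown in miniature: partition rows by key, aggregate locally on each shard, then merge the partial results. A toy sketch (real engines ship compact partial-aggregation states between nodes, not Python dicts):

    ```python
    from collections import defaultdict

    def partition(rows, n_nodes):
        """Hash-partition rows across nodes by id (toy stand-in for sharding)."""
        shards = defaultdict(list)
        for row in rows:
            shards[row["id"] % n_nodes].append(row)
        return shards

    def local_agg(shard):
        """Per-shard partial aggregate: (count, sum) per group, not a local average."""
        partial = defaultdict(lambda: [0, 0.0])
        for row in shard:
            partial[row["group"]][0] += 1
            partial[row["group"]][1] += row["value"]
        return partial

    def merge(partials):
        """Coordinator merges partial states, then finalizes the average."""
        total = defaultdict(lambda: [0, 0.0])
        for partial in partials:
            for group, (count, value_sum) in partial.items():
                total[group][0] += count
                total[group][1] += value_sum
        return {group: s / c for group, (c, s) in total.items()}

    rows = [{"id": i, "group": g, "value": v}
            for i, (g, v) in enumerate([("a", 10.0), ("a", 20.0), ("b", 30.0), ("b", 50.0)])]
    partials = [local_agg(shard) for shard in partition(rows, 3).values()]
    print(merge(partials))  # -> {'a': 15.0, 'b': 40.0}
    ```

    Shipping (count, sum) rather than local averages is what makes the merge correct; averaging per-shard averages would weight shards incorrectly, which hints at why complex aggregations and joins strain these architectures.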

  4. Article
    Tinybird · 17w

    Build a Real-Time E-Commerce Analytics API from Kafka in 15 Minutes

    A step-by-step guide to building a real-time e-commerce analytics API using Kafka as the data source. Covers connecting to Kafka, ingesting order events, enriching data with dimension tables and PostgreSQL, creating materialized views for pre-aggregated metrics, and exposing multiple API endpoints. The tutorial progresses from a basic 5-minute setup querying raw Kafka data to advanced features including data enrichment, automated PostgreSQL syncing, and optimized aggregations using materialized views. All implementation uses SQL and configuration without requiring application code.
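    A materialized view in this setup is essentially an incrementally maintained aggregate: each incoming order updates pre-computed totals so queries never rescan raw events. A toy in-memory stand-in (the tutorial itself does this in SQL; field names here are illustrative):

    ```python
    from collections import defaultdict

    class RevenueView:
        """Toy stand-in for a materialized view: pre-aggregated revenue per product.
        Each ingested order updates running totals, so reads never scan raw events."""

        def __init__(self):
            self.revenue = defaultdict(float)
            self.orders = defaultdict(int)

        def ingest(self, order: dict) -> None:
            key = order["product_id"]
            self.revenue[key] += order["amount"]
            self.orders[key] += 1

        def query(self, product_id: str) -> dict:
            return {"orders": self.orders[product_id],
                    "revenue": self.revenue[product_id]}

    view = RevenueView()
    view.ingest({"product_id": "sku-1", "amount": 20.0})
    view.ingest({"product_id": "sku-1", "amount": 5.5})
    print(view.query("sku-1"))  # -> {'orders': 2, 'revenue': 25.5}
    ```

    This is why the API endpoints stay fast at scale: the expensive work happens once at ingest time, and each query is a cheap lookup over the pre-aggregated state.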

  5. Article
    SingleStore · 1y

    Can a Database Be Faster Than a Formula 1 Engine?

    Formula 1 cars generate vast amounts of real-time telemetry data, which is crucial for strategic decision-making. Each car is fitted with 300 sensors producing 1.1 million data points per second. Teams use this data for simulations, performance analysis, and strategy adjustments. SingleStore's real-time analytics capabilities are highlighted, showcasing its ability to handle high-throughput data streams and provide millisecond response times. The post includes a practical guide for setting up a data ingestion and visualization simulation using SingleStore, Confluent Kafka, and Grafana.
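    Putting the quoted figures together gives a feel for the scale; the bytes-per-point figure below is an assumption for illustration only, not from the article:

    ```python
    sensors_per_car = 300
    points_per_second = 1_100_000          # figures quoted in the article
    per_sensor_hz = points_per_second / sensors_per_car
    print(f"~{per_sensor_hz:.0f} samples per sensor per second")  # -> ~3667

    bytes_per_point = 8                    # assumption for illustration only
    mb_per_second = points_per_second * bytes_per_point / 1_000_000
    print(f"~{mb_per_second} MB/s per car")  # -> ~8.8 MB/s per car
    ```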

  6. Article
    Tinybird · 51w

    dbt in real-time

    Tinybird offers an alternative to dbt for real-time analytics, simplifying the migration of API use cases from dbt. It provides built-in support for real-time processing and API endpoint creation, and simplifies the tech stack by consolidating all data operations in one platform. Tinybird uses ClickHouse under the hood for faster performance, especially for API responses. Migration involves mapping dbt concepts to Tinybird equivalents, such as using materialized views for incremental updates, and creating optimized data source schemas.

  7. Video
    Tech With Lucy · 31w

    The Future of Databases is here... (What you need to know)

    Modern databases are evolving to handle both transactional and analytical workloads in real-time, eliminating the need for complex multi-database architectures. Traditional setups require separate systems for applications and analytics, causing data delays and increased complexity. New unified database platforms can process transactions, run analytics, and support AI workloads simultaneously, enabling instant dashboards, real-time fraud detection, and live inventory systems. The future points toward databases that natively support vector search, unstructured data, and AI workloads without requiring multiple tools or complex data pipelines.

  8. Article
    Data Engineer Things · 49w

    Build a Streaming Deduplication Pipeline with Kafka, GlassFlow and ClickHouse

    The post provides a step-by-step tutorial on setting up a real-time data pipeline with Kafka, GlassFlow, and ClickHouse. It focuses on resolving duplicate-data issues in streaming pipelines using GlassFlow's deduplication, removing duplicates before they reach storage to improve performance and data integrity.
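    Streaming deduplication typically boils down to tracking recently seen event keys within a bounded window. A minimal sketch of the idea (GlassFlow's actual implementation and windowing semantics will differ):

    ```python
    from collections import OrderedDict

    class Deduplicator:
        """Drop events whose key was already seen within a bounded window.
        A toy sketch; GlassFlow's actual windowing semantics will differ."""

        def __init__(self, window_size: int = 10_000):
            self.window_size = window_size
            self.seen = OrderedDict()

        def process(self, event: dict):
            key = event["event_id"]
            if key in self.seen:
                return None                      # duplicate: drop before storage
            self.seen[key] = True
            if len(self.seen) > self.window_size:
                self.seen.popitem(last=False)    # evict the oldest key
            return event                         # unique: forward to ClickHouse

    dedup = Deduplicator(window_size=2)
    events = [{"event_id": k} for k in ("a", "a", "b")]
    print([dedup.process(e) for e in events])  # -> [{'event_id': 'a'}, None, {'event_id': 'b'}]
    ```

    A bounded window keeps memory constant; duplicates arriving after their key was evicted slip through, which is the usual trade-off when deduplicating an at-least-once stream.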