Cubo Pago

Driving Success with Cubo Pago through BigQuery and Looker
Services
Data Engineering
Analytics and Business Intelligence
Creating Dashboards
Description

Cubo Pago, an innovative Mexican fintech startup, partnered with Semantiks to explore the potential of Google Cloud Platform (GCP) and leverage its advanced business intelligence (BI) capabilities. With a focus on data analysis, Cubo Pago sought to optimize the management and reporting of its data, making the most of GCP tools, particularly BigQuery and Looker, to improve decision-making and business operations.

The Challenge

Cubo Pago faced challenges in data management and business intelligence. Its data was spread across several systems, including PostgreSQL databases hosted on AWS RDS. The company needed a scalable solution that could extract, load, and transform that data into actionable information. The goal was to implement a seamless data pipeline that could handle large volumes, automate workflows, and provide interactive, self-service analysis for the team. Beyond the pipeline, Cubo Pago wanted an intuitive BI platform that would let stakeholders access and analyze data efficiently.

Our Approach

Rather than starting with a workshop, we opted for direct collaboration with Cubo Pago, designing a tailor-made solution around their specific needs. The approach centered on establishing a robust data pipeline with BigQuery as the data warehouse and Looker as the business intelligence layer. Key elements of the approach included:

  • Establish an ELT (Extract, Load, Transform) process using Google Cloud Storage (GCS) and BigQuery to integrate data from PostgreSQL hosted on AWS RDS.
  • Automate data ingestion using Fivetran to simplify the ELT process.
  • Use BigQuery to transform and materialize data, creating a solid foundation for detailed analysis.
  • Configure Looker to develop visualizations, reports and dashboards that allow the team to explore data interactively.

The Solution

The solution implemented for Cubo Pago included several key components:

Data Integration:

  • Data was manually extracted from the PostgreSQL database hosted on AWS RDS, split into manageable chunks, and loaded into GCS.
  • External tables (L0) were created in BigQuery pointing to the raw files stored in GCS, providing a scalable, fault-tolerant way to work with the PostgreSQL data.
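
The L0 layer can be illustrated with the DDL for a BigQuery external table over raw files in GCS. The snippet below is a minimal sketch: the dataset, table, and bucket names are hypothetical placeholders, not Cubo Pago's actual resources.

```python
def external_table_ddl(dataset: str, table: str, gcs_uri: str, fmt: str = "CSV") -> str:
    """Build DDL for a BigQuery external (L0) table over raw GCS files.

    All names passed in (dataset, table, bucket) are hypothetical examples.
    """
    return (
        f"CREATE OR REPLACE EXTERNAL TABLE `{dataset}.{table}`\n"
        "OPTIONS (\n"
        f"  format = '{fmt}',\n"
        f"  uris = ['{gcs_uri}']\n"
        ")"
    )

# Example: an L0 table pointing at chunked PostgreSQL exports in GCS.
print(external_table_ddl("l0_raw", "transactions",
                         "gs://example-raw-bucket/transactions/*.csv"))
```

Because the external table only references the files, reloading fresh exports into the bucket makes new data queryable without recreating the table.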

Data Transformation:

  • BigQuery served as the transformation engine: data was materialized, cleaned, and aggregated through layered tables (L1 to L3) to ready it for reporting and analysis.
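
One way to picture the layered materialization is a cleaning step that deduplicates raw L0 records into an L1 table. The sketch below assumes hypothetical table and column names (`transaction_id`, `updated_at`); the actual transformation logic was specific to Cubo Pago's schema.

```python
def dedup_materialization(source: str, target: str, key: str, order_col: str) -> str:
    """Build a CREATE TABLE AS statement that keeps the latest row per key.

    Table and column names are illustrative, not Cubo Pago's real schema.
    """
    return (
        f"CREATE OR REPLACE TABLE `{target}` AS\n"
        "SELECT * EXCEPT (rn)\n"
        "FROM (\n"
        "  SELECT *,\n"
        f"         ROW_NUMBER() OVER (PARTITION BY {key} ORDER BY {order_col} DESC) AS rn\n"
        f"  FROM `{source}`\n"
        ")\n"
        "WHERE rn = 1"
    )

# Example: materialize a cleaned L1 table from the raw L0 external table.
print(dedup_materialization("l0_raw.transactions", "l1_clean.transactions",
                            "transaction_id", "updated_at"))
```

Aggregation layers (L2, L3) follow the same pattern, each materializing from the layer below.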

Business Intelligence with Looker:

  • Looker was configured with data views and persistent derived tables (PDTs) to improve query performance.
  • Two Explores were developed, allowing Cubo Pago's teams to easily analyze transaction data and merchant activity.
  • Dashboards were designed to surface key performance indicators (KPIs), enabling self-service BI for business users.
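
A PDT-backed view in Looker looks roughly like the LookML below. This is a hedged sketch: the view name, columns, source table, and rebuild trigger are hypothetical illustrations, not Cubo Pago's actual model.

```lookml
# Sketch of a Looker view backed by a persistent derived table (PDT).
# Table, columns, and trigger are hypothetical, not Cubo Pago's real model.
view: merchant_daily_totals {
  derived_table: {
    sql:
      SELECT merchant_id,
             DATE(created_at) AS txn_date,
             SUM(amount) AS total_amount
      FROM l3_reporting.transactions
      GROUP BY 1, 2 ;;
    # Rebuild the PDT once per day, when the date value changes.
    sql_trigger_value: SELECT CURRENT_DATE() ;;
  }

  dimension: merchant_id {
    type: string
    sql: ${TABLE}.merchant_id ;;
  }

  dimension: txn_date {
    type: date
    sql: ${TABLE}.txn_date ;;
  }

  measure: total_amount {
    type: sum
    sql: ${TABLE}.total_amount ;;
  }
}
```

Because the derived table is persisted, dashboards read from a precomputed result instead of re-aggregating raw data on every query.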

Automation:

  • Automation was implemented through scheduled queries in BigQuery and automated ingestion via Fivetran, reducing the need for manual intervention and ensuring timely updates.
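
The scheduled-query side of this automation can be sketched as a BigQuery Data Transfer Service configuration. The field names below follow that service's transfer-config shape; the display name, query, and schedule are hypothetical examples.

```python
def scheduled_query_config(display_name: str, query: str,
                           schedule: str = "every 24 hours") -> dict:
    """Sketch of a BigQuery scheduled query as a Data Transfer Service config.

    The display name, query, and schedule here are illustrative only.
    """
    return {
        "display_name": display_name,
        "data_source_id": "scheduled_query",  # fixed ID for scheduled queries
        "schedule": schedule,
        "params": {"query": query},
    }

# Example: refresh a (hypothetical) cleaned L1 table daily.
config = scheduled_query_config(
    "refresh_l1_transactions",
    "CREATE OR REPLACE TABLE `l1_clean.transactions` AS SELECT ...",
)
print(config["display_name"], config["schedule"])
```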

The Results

The proof of concept (PoC) project with Cubo Pago was a resounding success, achieving the following key results:

  • Optimized Data Pipeline: The ELT process established a reliable, scalable pipeline for extracting, loading, and transforming data, allowing Cubo Pago to handle its data more efficiently.
  • Advanced Analysis: Looker's dashboards offered actionable insights, making it easy to analyze transaction data, track merchant activities, and monitor KPIs.
  • Self-Service BI: The configuration empowered Cubo Pago's teams to explore data, create personalized reports, and make data-driven decisions across the organization.
  • Scalability and Automation: The use of BigQuery and Fivetran ensured that the data pipeline was scalable and automated, reducing reliance on manual management and allowing the business to focus on obtaining insights instead of managing data.
Looking Ahead

Cubo Pago is exploring new opportunities to improve its data analysis capabilities, including automating the ingestion of new and updated records. Possible future improvements include using Google Cloud SQL to simplify integration with PostgreSQL, eliminating the need for manual exports and allowing the database to be queried directly from BigQuery via federated queries. This would further streamline the data pipeline, enabling near-real-time processing and improving operational efficiency.
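
If the PostgreSQL data moves to Cloud SQL, BigQuery's `EXTERNAL_QUERY` function can run SQL against the database directly. A minimal sketch, assuming a hypothetical connection ID:

```python
def federated_query(connection_id: str, pg_sql: str) -> str:
    """Wrap a PostgreSQL statement in BigQuery's EXTERNAL_QUERY function.

    The connection ID and table names are hypothetical placeholders.
    """
    escaped = pg_sql.replace("'", "\\'")  # escape quotes inside the inner SQL
    return f"SELECT * FROM EXTERNAL_QUERY('{connection_id}', '{escaped}')"

# Example: query the live database from BigQuery, no manual export needed.
print(federated_query("my-project.us.example-pg-connection",
                      "SELECT id, amount FROM transactions"))
```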

Conclusion

The successful implementation of BigQuery and Looker for Cubo Pago exemplifies how leveraging Google Cloud Platform can produce powerful data solutions. By partnering with Semantiks, Cubo Pago not only optimized its data management but also empowered its teams with tools to extract valuable insights and make informed decisions. Semantiks remains committed to helping clients like Cubo Pago unlock the full potential of their data to achieve business success.
