In the enterprise, data is the most valuable—and the most fragile—asset. At DuskByte, we don’t just "migrate" data; we modernize its entire lifecycle. As a Senior Data Engineer, you will be responsible for extracting business intelligence from 20-year-old legacy schemas (SQL Server, Oracle, MySQL) and architecting high-performance, cloud-native data environments that power the next generation of AI and Analytics.
What You Will Do (The Role)
Legacy Schema Refactoring
Analyze and normalize fragmented, unoptimized legacy databases into performant, scalable cloud architectures.
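The kind of refactor this involves can be sketched in miniature. The snippet below (illustrative only, with a hypothetical `orders_flat` legacy table and made-up rows) splits a denormalized table that repeats customer details on every order into a normalized `customers`/`orders` pair with a surrogate key:

```python
import sqlite3

# Hypothetical denormalized legacy table: customer details repeated per order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders_flat (
    order_id INTEGER, customer_email TEXT, customer_name TEXT, amount REAL
);
INSERT INTO orders_flat VALUES
    (1, 'ada@example.com', 'Ada', 120.0),
    (2, 'ada@example.com', 'Ada', 80.0),
    (3, 'bob@example.com', 'Bob', 45.5);

-- Normalize: one row per customer, orders reference a surrogate key.
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY AUTOINCREMENT,
    email TEXT UNIQUE, name TEXT
);
INSERT INTO customers (email, name)
    SELECT DISTINCT customer_email, customer_name FROM orders_flat;

CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount REAL
);
INSERT INTO orders
    SELECT f.order_id, c.customer_id, f.amount
    FROM orders_flat f JOIN customers c ON c.email = f.customer_email;
""")

customer_count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
order_count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

At enterprise scale the same pattern runs as staged DDL plus backfill jobs rather than one script, but the core move is identical: deduplicate the entity, then re-point the facts at it.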
Zero-Downtime Data Migrations
Execute complex data cutovers using tools like AWS DMS, GCP Dataflow, or Azure Data Factory with 100% data integrity.
Modern Data Stack (MDS) Implementation
Build automated ELT/ETL pipelines using tools like dbt, Airflow, or Prefect to replace brittle, manual scripts.
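The ELT pattern those tools implement can be sketched in a few lines. This is a toy stand-in (hypothetical source rows and table names), not a dbt or Airflow job: raw data is loaded untouched first, then cleaned and typed inside the warehouse with SQL, the way a dbt staging model would:

```python
import sqlite3

warehouse = sqlite3.connect(":memory:")

def extract():
    # Stand-in for pulling rows from a legacy source system.
    return [("2024-01-01", "  widget ", "19.99"),
            ("2024-01-02", "GADGET", "5.00")]

def load(rows):
    # Load raw, untyped data first -- no cleanup at this stage (the "L" before "T").
    warehouse.execute("CREATE TABLE raw_sales (sold_at TEXT, product TEXT, price TEXT)")
    warehouse.executemany("INSERT INTO raw_sales VALUES (?, ?, ?)", rows)

def transform():
    # Transform in-warehouse: trim, lowercase, and cast types in SQL.
    warehouse.execute("""
        CREATE TABLE stg_sales AS
        SELECT sold_at,
               LOWER(TRIM(product)) AS product,
               CAST(price AS REAL)  AS price
        FROM raw_sales
    """)

load(extract())
transform()
products = [r[0] for r in warehouse.execute(
    "SELECT product FROM stg_sales ORDER BY product")]
```

The point of the pattern is that transformations live as versioned SQL in the warehouse, where an orchestrator can schedule, test, and rerun them, instead of as one-off manual scripts.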
Enterprise Analytics Foundations
Design and maintain data warehouses on BigQuery, Redshift, or Snowflake, enabling "Single Source of Truth" reporting for B2B stakeholders.
Security & Governance
Implement row-level security, data masking, and PII encryption to ensure compliance with GDPR, HIPAA, and SOC2.
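Two of those controls can be illustrated with plain Python (hypothetical field names; a real deployment would lean on the warehouse's native masking policies and a managed KMS rather than hand-rolled helpers): masking an email for display, and deriving a salted hash as a pseudonymous join key so the raw PII never lands in analytics tables:

```python
import hashlib

def mask_email(email: str) -> str:
    """Keep the first character and the domain; hide the rest of the local part."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

def pseudonymize(value: str, salt: str) -> str:
    """Salted SHA-256 digest, usable as a stable join key across tables."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

masked = mask_email("ada.lovelace@example.com")
token = pseudonymize("ada.lovelace@example.com", salt="per-env-secret")
```

Masking governs what analysts see; pseudonymization governs what gets stored, which is the distinction GDPR and HIPAA reviews tend to probe.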
The Data Architect Tech Stack
We expect mastery of the tools that move and store the world's data.
The "Classic" Core
Expert-level SQL (PostgreSQL, MySQL, SQL Server) and experience with legacy Oracle or On-prem storage.
Cloud Warehousing
Deep experience in Google BigQuery, Amazon Redshift, or Snowflake.
Orchestration & Transformation
Python (Pandas/PySpark), dbt, Apache Airflow, or Dagster.
Cloud Migration Tools
AWS DMS, GCP Database Migration Service, Fivetran, or Airbyte.
Streaming & Real-time
Experience with Kafka, RabbitMQ, or AWS Kinesis is a major plus.
Who You Are (Requirements)
The "Data Integrity" Obsessive
You understand that 99.9% accuracy is a failure. You build automated validation checks into every stage of the pipeline.
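A minimal sketch of what such a validation gate looks like, assuming hypothetical `accounts` tables on both sides: compare row counts and a column checksum between source and target, and fail loudly on any mismatch instead of accepting "close enough":

```python
import sqlite3

# Two stand-in databases playing source and migration target.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
    db.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 250)])

def validate(src, tgt, table, checksum_col):
    q_count = f"SELECT COUNT(*) FROM {table}"
    q_sum = f"SELECT COALESCE(SUM({checksum_col}), 0) FROM {table}"
    checks = {
        "row_count": (src.execute(q_count).fetchone()[0],
                      tgt.execute(q_count).fetchone()[0]),
        "checksum":  (src.execute(q_sum).fetchone()[0],
                      tgt.execute(q_sum).fetchone()[0]),
    }
    mismatches = {k: v for k, v in checks.items() if v[0] != v[1]}
    if mismatches:
        # Hard stop: a migration with any drift is a failed migration.
        raise AssertionError(f"validation failed for {table}: {mismatches}")
    return checks

result = validate(source, target, "accounts", "balance")
```

Production versions add per-partition checksums and sampling-based column diffs, but the shape is the same: every stage ends in a check that can veto the cutover.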
The Performance Specialist
You know exactly how to optimize a query for a 10TB dataset and how to choose between Star, Snowflake, or OBT schemas.
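As a sketch of the Star option (hypothetical dimension and fact tables with made-up rows): a narrow fact table keyed to small dimension tables, queried with the fact-join-dimensions pattern that warehouse optimizers handle well:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, revenue REAL);
INSERT INTO dim_date VALUES (1, '2024-01'), (2, '2024-02');
INSERT INTO dim_product VALUES (10, 'hardware'), (20, 'software');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# Typical star-schema rollup: join the fact to its dimensions, then aggregate.
rows = db.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
```

Snowflake schemas further normalize the dimensions, and OBT pre-joins everything into one wide table; the trade-off is join cost at query time versus storage and update complexity.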
The Strategic Problem Solver
You can identify "Dirty Data" in a legacy system and build the logic to clean and enrich it during migration.
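A toy example of that in-flight cleanup logic (hypothetical legacy rows and field names): inconsistent phone formats and stray whitespace are normalized, then duplicates collapse before the rows land in the target schema:

```python
import re

legacy_rows = [
    {"name": " Ada Lovelace ", "phone": "(555) 123-4567"},
    {"name": "Ada Lovelace",   "phone": "555.123.4567"},   # same person, new format
    {"name": "Bob Byte",       "phone": "5559876543"},
]

def clean(row):
    # Normalize: strip whitespace from names, keep only digits in phones.
    digits = re.sub(r"\D", "", row["phone"])
    return {"name": row["name"].strip(), "phone": digits}

def dedupe(rows):
    # Collapse rows that become identical after normalization.
    seen, out = set(), []
    for row in rows:
        key = (row["name"], row["phone"])
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out

cleaned = dedupe([clean(r) for r in legacy_rows])
```

The key discipline is that the cleanup rules are code, not one-off fixes, so the same logic can be replayed, audited, and unit-tested against the legacy extract.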
Experience
8+ years in Data Engineering, with a proven track record of handling large-scale enterprise database migrations.
Why This Role is Critical at DuskByte
You are the one who makes "Intelligence" possible. Without your work, the AI/Automation team has nothing to process, and the Enterprise Modernization team has no foundation. You turn "Legacy Liability" into "Data Assets."