Database Administrator

Application ends: August 10, 2025

Job Description

About the Role

We are seeking a detail-obsessed Database Administrator to take ownership of our PostgreSQL and MongoDB environments that support both real-time analytics and transactional workflows. This is not a maintenance role — we’re looking for someone who wants to optimize, refactor, and challenge the status quo of how data is stored, secured, and accessed.

Key Responsibilities

  • Design and refactor schemas to support low-latency reads for our internal analytics tools and customer-facing dashboards.
  • Implement partitioning, materialized views, and advanced indexing strategies to support high-throughput operations (millions of rows daily).
  • Own and continuously improve our automated backup and restore systems, including point-in-time recovery, cross-region replication, and verification testing.
  • Work closely with SREs to deploy databases via Terraform and Helm into our Kubernetes clusters — you should know how to containerize a DB, not just connect to one.
  • Conduct regular query audits and recommend rewrites or changes to application-layer logic to reduce load.
  • Lead migration planning for legacy MySQL instances and contribute to deprecating unmaintainable schemas.
  • Ensure compliance with SOC 2, HIPAA, and GDPR requirements for data access, retention, and auditability.
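
The backup-and-restore responsibility above includes verification testing. As a minimal sketch of that idea in Python (one of the scripting languages named below): record a checksum when a dump is written, then confirm it before trusting a restore. The file names and manifest format here are hypothetical stand-ins, not part of any actual tooling at the company.

```python
import hashlib
import json
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large dumps never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(backup: Path, manifest: Path) -> None:
    """Record the backup's checksum so a later restore job can verify it."""
    manifest.write_text(json.dumps({"file": backup.name, "sha256": sha256_of(backup)}))

def verify_backup(backup: Path, manifest: Path) -> bool:
    """Return True only if the backup still matches the checksum taken at write time."""
    recorded = json.loads(manifest.read_text())
    return recorded["sha256"] == sha256_of(backup)

if __name__ == "__main__":
    workdir = Path(tempfile.mkdtemp())
    backup = workdir / "db_dump.sql"           # stand-in for a real pg_dump file
    manifest = workdir / "db_dump.manifest"
    backup.write_bytes(b"-- fake dump contents\n")
    write_manifest(backup, manifest)
    print(verify_backup(backup, manifest))     # intact backup verifies: True
    backup.write_bytes(b"-- corrupted\n")      # simulate bit rot or truncation
    print(verify_backup(backup, manifest))     # tampered backup fails: False
```

In practice this check would sit alongside an actual restore into a scratch instance; a checksum alone proves the file is intact, not that it restores.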

Must-Have Experience

  • 4+ years of hands-on experience with PostgreSQL (including extensions like PostGIS, pg_stat_statements, and logical replication).
  • 2+ years with MongoDB in a production environment — including sharding, replica sets, and performance tuning.
  • Demonstrated ability to analyze and improve complex SQL queries and execution plans.
  • Deep understanding of ACID properties, MVCC, and isolation levels — you should be able to explain phantom reads and fix them.
  • Experience integrating IAM-based access control across DB clusters.
  • Proficiency in scripting (Python or Bash) for automation and monitoring (preferably with Prometheus/Grafana).
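
The phantom-read requirement above can be illustrated without a Postgres server. The sketch below uses SQLite (Python's stdlib) in WAL mode as a stand-in: in autocommit mode, a repeated range query sees a row inserted by another connection (the phantom), while inside an explicit transaction the snapshot holds. In PostgreSQL the equivalent fix is `SET TRANSACTION ISOLATION LEVEL REPEATABLE READ` (or `SERIALIZABLE`); the table and values here are invented for the demo.

```python
import sqlite3
import tempfile
from pathlib import Path

db = Path(tempfile.mkdtemp()) / "demo.db"

# Two connections in autocommit mode (isolation_level=None) on one file database.
reader = sqlite3.connect(db, isolation_level=None)
writer = sqlite3.connect(db, isolation_level=None)
reader.execute("PRAGMA journal_mode=WAL")   # WAL: readers keep a stable snapshot
writer.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
writer.execute("INSERT INTO accounts (balance) VALUES (50), (75)")

def count(conn):
    """Range query whose result set a concurrent insert can grow."""
    return conn.execute(
        "SELECT COUNT(*) FROM accounts WHERE balance < 100").fetchone()[0]

# Phantom read: each autocommit SELECT takes a fresh snapshot, so a row
# committed by another connection appears between two identical reads.
before = count(reader)
writer.execute("INSERT INTO accounts (balance) VALUES (10)")
after = count(reader)
print(after - before)                       # 1 -> a phantom row appeared

# Fix: hold one transaction open; the snapshot taken at the first read
# lasts until COMMIT, so the concurrent insert stays invisible.
reader.execute("BEGIN")
inside_before = count(reader)
writer.execute("INSERT INTO accounts (balance) VALUES (20)")
inside_after = count(reader)
reader.execute("COMMIT")
print(inside_after - inside_before)         # 0 -> no phantom inside the txn
```

The same two-session experiment run in `psql` at READ COMMITTED vs. REPEATABLE READ is a fair interview warm-up for this bullet.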

Nice-to-Have

  • Experience with CDC tools like Debezium or Apache Kafka Connect.
  • Familiarity with columnar stores (e.g., ClickHouse or Redshift) and how to manage ingestion pipelines from OLTP sources.
  • Contributions to open-source database tooling or plugins.

Are you interested in this position?

Apply by clicking on the “Apply Now” button below!