Build With Us

Senior Platform Engineer

Software Engineering
India · Remote
Posted on Tuesday, March 12, 2024

Introduction to Demandbase:

Demandbase is the Smarter GTM™ company for B2B brands. We help B2B companies hit their revenue goals using fewer resources. How? By aligning their sales and marketing teams around a combination of their data, our data, and artificial intelligence — what we call Account Intelligence — so they can identify, engage, and focus their time and money on the accounts most likely to buy.

As a company, we’re as committed to growing careers as we are to building world-class technology. We invest heavily in people, our culture, and the community around us. We have offices in the San Francisco Bay Area, Seattle, and India, as well as a team in the UK, and allow employees to work remotely. We have also been continuously recognized as one of the best places to work in the San Francisco Bay Area, including “Best Workplaces for Millennials” and “Best Workplaces for Parents”!

We're committed to attracting, developing, retaining, and promoting a diverse workforce. By ensuring that every Demandbase employee is able to bring a diversity of talents to work, we're increasingly capable of living out our mission to transform how B2B goes to market. We encourage people from historically underrepresented backgrounds and all walks of life to apply. Come grow with us at Demandbase!

About the Role:

We are seeking a talented and experienced Platform Engineer to join our dynamic team. As a Platform Engineer, you will play a crucial role in designing, implementing, and maintaining our cloud infrastructure and deployment pipelines. You will work closely with development teams to streamline the software delivery process, optimize performance, and ensure the reliability and scalability of our applications.

What you'll be doing:

  • Design, deploy, and manage Kubernetes clusters to orchestrate containerized applications.
  • Develop and maintain CI/CD pipelines to automate software build, test, and deployment processes.
  • Utilize Terraform for infrastructure as code to provision and manage AWS resources.
  • Implement and manage Airflow workflows for data pipeline orchestration and scheduling.
  • Configure and maintain Istio service mesh for microservices communication, traffic management, and security.
  • Collaborate with development teams to optimize application performance, scalability, and reliability.
  • Monitor and troubleshoot production systems, ensuring high availability and performance.
  • Implement security best practices and compliance standards across the infrastructure.
  • Provide support and guidance to development teams on infrastructure-related issues and best practices.
  • Document infrastructure configurations, processes, and procedures.
  • Empower development teams to clear operational hurdles by setting best practices, creating reusable templates and modules, and writing clear documentation in a quickly evolving “shift left” culture.

About You:

  • You have strong analytical and problem-solving skills
  • You are a self-motivated learner
  • You are eager to learn new technologies
  • You are receptive to constructive feedback
  • You are confident and articulate with excellent written and verbal communication skills
  • You thrive in a fast-paced environment focused on “shifting left”

Skills & Education:

  • Bachelor’s or Master’s degree in Computer Science, Mathematics, or Statistics from a top engineering institution.
  • 6+ years of experience in a similar role, with a strong focus on cloud infrastructure and DevOps practices.
  • Hands-on experience with Kubernetes, including cluster setup, configuration, and management.
  • Proficiency in building and maintaining CI/CD pipelines using tools like Jenkins, GitLab CI/CD, or similar.
  • Solid understanding of infrastructure as code principles and experience with Terraform.
  • Experience with AWS services, including EKS, S3, RDS, Lambda, IAM, etc.
  • Familiarity with Airflow for workflow orchestration and scheduling.
  • Knowledge of Istio or similar service mesh technologies for microservices architecture.
  • Strong scripting skills in languages such as Bash, Python, or similar.
  • Experience with monitoring and logging tools such as Prometheus, Grafana, or Datadog.
  • Excellent problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills, with a strong product mindset.
  • Nice to have: experience designing and implementing ETL data pipelines using open-source platforms.
  • Nice to have: experience with other cloud providers such as Azure or Google Cloud Platform.

Our Commitment to Diversity, Equity, and Inclusion at Demandbase

At Demandbase, we believe in creating a workplace culture that values and celebrates diversity in all its forms. We recognize that everyone brings unique experiences, perspectives, and identities to the table, and we are committed to building a community where everyone feels valued, respected, and supported. Discrimination of any kind is not tolerated, and we strive to ensure that every individual has an equal opportunity to succeed and grow, regardless of their gender identity, sexual orientation, disability, race, ethnicity, background, marital status, genetic information, education level, veteran status, national origin, or any other protected status. We do not automatically disqualify applicants with criminal records and will consider each applicant on a case-by-case basis.

We recognize that not all candidates will have every skill or qualification listed in this job description. If you feel you have the level of experience to be successful in the role, we encourage you to apply!

We acknowledge that true diversity and inclusion require ongoing effort, and we are committed to doing the work required to make our workplace a safe and equitable space for all. Join us in building a community where we can learn from each other, celebrate our differences, and work together.

Personal information that you submit will be used by Demandbase for recruiting and other business purposes. Our Privacy Policy explains how we collect and use personal information.