Are you a passionate technologist eager to build impactful solutions in a collaborative environment? At BMC Control-M’s DataOps team, we're developing a next-generation data quality solution that ensures reliable, secure, and actionable insights for enterprise systems.
BMC is looking for a skilled Python Developer who’s excited to work with modern data platforms and solve complex, distributed software challenges. If you thrive in a fast-paced, innovative setting and enjoy turning ideas into scalable solutions, let’s talk.
In this role, you will:
- Design and develop features for a new add-on to an industry-leading product.
- Contribute to project design with a focus on scalability, reliability, and performance.
- Enhance existing features and develop new ones, including bug fixes and improvements in complex areas of the codebase.
- Troubleshoot and resolve complex technical issues in both development and production environments.
- Research and evaluate technologies/tools to support the product vision and roadmap.
What you’ll need to fit the role:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- 3+ years of hands-on experience with Python development, its ecosystem, and tooling. Java knowledge is a plus!
- Proven experience with Linux and Windows operating systems.
- Strong proficiency in SQL and relational database systems.
- Experience developing and integrating REST APIs.
- Solid understanding of object-oriented design and software engineering principles.
- Proactive, self-driven mindset with a strong problem-solving orientation.
- Excellent communication and collaboration skills.
It would be an advantage for you to have:
- Knowledge of Python packaging and software distribution.
- Experience working in an Agile development environment with strong testing practices.
- Familiarity with DataOps practices, data modeling, and data warehousing concepts.
- Experience with relational (SQL) and NoSQL databases and data warehouses (e.g., PostgreSQL, Snowflake).
- Experience with GraphQL.
- Experience with data processing engines and related tooling (e.g., Apache Spark, DuckDB, OpenMetadata).
- Knowledge of security best practices (e.g., certificates, encryption).
- Experience with containerization and orchestration tools (Docker, Kubernetes, Helm).
- Hands-on work with AWS services (e.g., Lambda, S3, EMR).