Employment Opportunity

Job Title: DataBricks Architect

Job Code: GW-12082313393930

Salary Range: $130K

Job Location:

City: Richardson State: TX

Job Description:

In this role, the Databricks Architect is responsible for providing technical direction 
and leading a group of one or more developers toward a common goal. 



Architect and design solutions to meet functional and non-functional requirements.
Create and review architecture and solution design artifacts.
Evangelize re-use through the implementation of shared assets.  
Enforce adherence to architectural standards/principles, global product-specific 
guidelines, usability design standards, etc.  
Proactively guide engineering methodologies, standards, and leading practices.  
Guide engineering staff and review as-built configurations during the 
construction phase.
Provide insight and direction on the roles and responsibilities required for the solution.
Identify, communicate and mitigate Risks, Assumptions, Issues, and Decisions throughout 
the full lifecycle.
Consider the art of the possible, compare architectural options based on 
feasibility and impact, and propose actionable plans.
Demonstrate strong analytical and technical problem-solving skills.
Ability to analyze and operate at various levels of abstraction.  
Ability to balance what is strategically right with what is practically realistic. 
Qualifications we seek in you! 

Minimum qualifications 

Excellent technical architecture skills, enabling the creation of future-proof, complex 
global solutions.
Excellent interpersonal communication and organizational skills are required to operate as 
a leading member of global, distributed teams that deliver quality services and solutions.  
Ability to rapidly gain knowledge of the organizational structure of the firm to 
facilitate work with groups outside of the immediate technical team.  
Knowledge of and experience with the IT methodologies and life cycles that will be used.  
Familiar with solution implementation/management, service/operations management, etc.  
Leadership skills that can inspire and persuade others.
Maintains close awareness of new and emerging technologies and their potential application 
for service offerings and products.  
Bachelor’s degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or 
equivalent work experience.  
Experience in a solution architecture role using service and hosting solutions such as 
private/public cloud IaaS, PaaS, and SaaS platforms. 
Experience in architecting and designing technical solutions for cloud-centric solutions 
based on industry standards using IaaS, PaaS, and SaaS capabilities.  
Must have strong hands-on experience with various cloud services such as ADF/Lambda, ADLS/S3, 
security, monitoring, and governance.
Must have experience designing platforms on Databricks.
Hands-on experience designing and building Databricks-based solutions on any cloud platform.
Hands-on experience designing and building solutions powered by DBT models, including 
integration with the data platform.
Must be highly proficient in designing end-to-end solutions on a cloud platform.
Must have good knowledge of data engineering concepts and related cloud services.
Must have good experience in Python and Spark.
Must have good experience in setting up development best practices.
Intermediate-level knowledge of data modelling is required.
Good to have knowledge of Docker and Kubernetes.
Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.  
Knowledge of cloud security controls including tenant isolation, encryption at rest, 
encryption in transit, key management, vulnerability assessments, application firewalls, 
SIEM, etc.  
Experience building and supporting mission-critical technology components with DR capabilities.
Experience with multi-tier system and service design and development for large enterprises  
Extensive, real-world experience designing technology components for enterprise solutions 
and defining solution architectures and reference architectures with a focus on cloud technologies.
Exposure to infrastructure and application security technologies and approaches  
Familiarity with requirements gathering techniques.  

Preferred qualifications /Skills

Must have designed the end-to-end (E2E) architecture of a unified data platform covering all 
aspects of the data lifecycle: ingestion, transformation, serving, and consumption.
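Purely as an illustration of the lifecycle stages named above (ingestion, transformation, serving, consumption), the layering can be sketched as small Python functions; the function and field names below are hypothetical, not from any real platform:

```python
# Illustrative only: each stage of the data lifecycle as a small function.
# Function and field names are hypothetical.

def ingest(raw_rows):
    """Ingestion: land raw records as-is (the raw/bronze layer)."""
    return list(raw_rows)

def transform(rows):
    """Transformation: clean and conform records (the curated/silver layer)."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in rows
        if r.get("user") and r.get("amount") is not None
    ]

def serve(rows):
    """Serving: aggregate for downstream consumption (the gold layer)."""
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

raw = [
    {"user": " Alice ", "amount": "10.5"},
    {"user": "alice", "amount": "4.5"},
    {"user": None, "amount": "1.0"},  # dropped during transformation
]
report = serve(transform(ingest(raw)))
print(report)  # {'alice': 15.0}
```

In a real Databricks platform each stage would be a Spark job or DBT model rather than a plain function; the point is only the separation of lifecycle stages.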
Must have excellent coding skills in either Python or Scala, preferably Python.
Must have good amount of experience in Data Engineering domain.
Must have designed and implemented at least 2-3 projects end-to-end in Databricks.
Must have hands-on experience with Databricks and its components, such as:
Delta lake
db API 2.0
SQL Endpoint – Photon engine
Unity Catalog
Databricks workflows orchestration
Security management
Platform governance
Data Security
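To give a feel for one of the components above, the upsert semantics behind Delta Lake's MERGE INTO can be sketched in plain Python (a dict keyed by primary key stands in for a Delta table; all names are hypothetical):

```python
# Illustrative only: the update-or-insert semantics of a MERGE (upsert),
# modelled with a dict keyed by primary key. Names are hypothetical.

def merge_upsert(target, updates, key="id"):
    """Update rows whose key matches; insert rows whose key does not."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "new"}, {"id": 2, "status": "new"}]
updates = [{"id": 2, "status": "done"}, {"id": 3, "status": "new"}]
result = merge_upsert(target, updates)
print(result)
# [{'id': 1, 'status': 'new'}, {'id': 2, 'status': 'done'}, {'id': 3, 'status': 'new'}]
```

In Delta Lake itself this is expressed as `MERGE INTO target USING updates ON target.id = updates.id ...`, with ACID guarantees the sketch does not attempt to model.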
Must have applied appropriate architectural principles to design the solution best suited 
to each problem.
Must be well versed in the Databricks Lakehouse concept and its implementation in enterprise 
environments.
Must have a strong understanding of data warehousing and of the governance and security 
standards around Databricks.
Must have knowledge of cluster optimization and its integration with various cloud services.
Must have a good understanding of how to create complex data pipelines.
Must be strong in SQL and Spark SQL.
Must have strong performance optimization skills to improve efficiency and reduce cost.
Must have worked on designing both batch and streaming data pipelines.
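The batch-versus-streaming distinction can be illustrated in plain Python, standing in for Spark jobs (names are hypothetical): a batch job makes one pass over the complete dataset, while a streaming job maintains incremental state across micro-batches.

```python
# Illustrative only: batch processes the full dataset at once; streaming
# updates state per micro-batch. Names are hypothetical.

def batch_count(events):
    """Batch: one pass over the complete dataset."""
    counts = {}
    for e in events:
        counts[e] = counts.get(e, 0) + 1
    return counts

class StreamingCounter:
    """Streaming: state is carried forward as each micro-batch arrives."""
    def __init__(self):
        self.state = {}

    def process_micro_batch(self, events):
        for e in events:
            self.state[e] = self.state.get(e, 0) + 1
        return dict(self.state)  # snapshot after this micro-batch

events = ["click", "view", "click"]
stream = StreamingCounter()
stream.process_micro_batch(events[:2])
final = stream.process_micro_batch(events[2:])
print(batch_count(events) == final)  # same result, different execution model
```

In Spark the same duality appears as a batch DataFrame job versus a Structured Streaming query with stateful aggregation.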
Must have extensive knowledge of the Spark and Hive data processing frameworks.  
Must have worked on a cloud platform (Azure, AWS, or GCP) and its most common services, such 
as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
Must be strong in writing unit and integration tests.
Must have strong communication skills and experience working with cross-platform teams.
Must have a great attitude towards learning new skills and upskilling existing ones.
Responsible for setting best practices around Databricks CI/CD.
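As one illustration of the unit-testing expectation above, pipeline logic can be kept in pure functions so it is testable without a cluster; the function and field names below are hypothetical, and in practice the same pattern would target Spark DataFrames under a framework such as pytest:

```python
# Illustrative only: a pipeline transformation as a pure function, unit-tested
# directly. Names are hypothetical.

def dedupe_latest(rows):
    """Keep only the latest record per id, by timestamp."""
    latest = {}
    for r in rows:
        if r["id"] not in latest or r["ts"] > latest[r["id"]]["ts"]:
            latest[r["id"]] = r
    return sorted(latest.values(), key=lambda r: r["id"])

def test_dedupe_latest():
    rows = [
        {"id": 1, "ts": 1, "v": "old"},
        {"id": 1, "ts": 2, "v": "new"},
        {"id": 2, "ts": 1, "v": "only"},
    ]
    assert dedupe_latest(rows) == [
        {"id": 1, "ts": 2, "v": "new"},
        {"id": 2, "ts": 1, "v": "only"},
    ]

test_dedupe_latest()
print("ok")
```

Integration tests would then exercise the same function against real storage and compute, which the sketch leaves out.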
Must understand composable architecture to take full advantage of Databricks capabilities.
Good to have REST API knowledge.
Good to have an understanding of cost distribution.
Good to have experience with migration projects building a unified data platform.
Good to have knowledge of DBT.
Experience with DevSecOps, including Docker and Kubernetes.
Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools.
Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, 
SQL, Java, Python, etc.  
Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, etc.
Experience with visualization tools such as Tableau and Power BI.
Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS 
SageMaker, etc.  
Experience in distilling complex technical challenges to actionable decisions for 
stakeholders and guiding project teams by building consensus and mediating compromises 
when necessary.  
Experience coordinating the intersection of complex system dependencies and interactions  
Experience in solution delivery using common methodologies especially SAFe Agile but also 
Waterfall, Iterative, etc.  
Demonstrated knowledge of relevant industry trends and standards