Role: Azure Databricks Engineer
Location: Jersey City, NJ - Hybrid (2 days Onsite)
NOTE: Must be available for a face-to-face screening and the final round of interviews.
Responsibilities:
1. Design and implement a scalable data warehouse on Azure Databricks using data modeling and dimensional modeling techniques to support analytical and reporting requirements.
2. Develop and optimize ETL/ELT pipelines using Python, Azure Databricks, and PySpark for large-scale data processing, ensuring data quality, consistency, and integrity.
3. Establish and implement best practices for data ingestion, transformation, and storage using the medallion architecture (Bronze, Silver, Gold).
4. Architect and develop highly scalable data applications using Azure Databricks and distributed computing.
5. Optimize Databricks clusters and ETL/ELT workflows for performance and scalability.
6. Manage data storage solutions using Azure Data Lake Storage (ADLS) and Delta Lake while leveraging Unity Catalog for data governance, security, and access control.
7. Develop and schedule Databricks notebooks and jobs for automated daily execution, implementing monitoring, alerting, and automated recovery processes for job failures.
8. Identify and resolve bottlenecks in existing code and follow best coding practices to improve performance and maintainability.
9. Use GitHub as the version control tool to manage code and collaborate effectively with other developers; build and maintain CI/CD pipelines for deployment and testing using Azure DevOps and GitHub.
10. Create comprehensive documentation for data architecture, ETL processes, and business logic.
11. Work closely with business stakeholders to understand project goals and architect scalable and efficient solutions.
12. Apply knowledge of user authentication on Unity Catalog and authorization across multiple systems, servers, and environments.
13. Ensure that programs are written to the highest standards (e.g., covered by unit tests) and meet technical specifications.
14. Collaborate on projects and work independently when required.
Qualifications:
1. 10+ years of prior experience as a developer in the required technologies (Azure Databricks, Python, PySpark, data warehouse design)
2. Solid organizational skills and the ability to multi-task across different projects
3. Experience with Agile methodologies
4. Skilled at independently researching topics using all means available to discover relevant information.
5. Ability to work in a team environment.
6. Excellent verbal and written communication skills
7. Self-starter with the ability to multi-task and maintain momentum.