Position 1: Databricks Platform/DevOps Engineer - 1 position
Assignment Duration: From now to end of Dec 2024
Job Description (JD):
Experience: 5+ years
Skills Required:
- Databricks/Azure admin
- Azure DevOps, Git/GitHub Actions, Terraform
- Experience in setting up CI/CD
- Experience in Azure services admin (ADF, Logic Apps, Blob, ADLS, Key Vault, etc.)
- Experience in Databricks admin (Workspaces, Unity Catalog, Volumes, External Volumes, etc.)

Position 2: Databricks Security/Compliance Engineer - 1 position
Assignment Duration: From now to end of Dec 2024
Job Description (JD):
Experience: 5+ years
Skills Required: Databricks/Azure admin
Responsibilities:
- Security Controls Implementation: Implement and maintain security controls across the Azure Databricks/Delta Lake platform and the data ingestion and orchestration platforms, including access controls, encryption, network security, and vulnerability management.
- Compliance Management: Monitor and ensure compliance with relevant industry standards, regulations (e.g., SOX, GDPR, HIPAA), and internal security policies.
- Risk Assessment: Conduct risk assessments on Azure cloud data platforms to identify potential vulnerabilities and threats. Provide recommendations and implement remediation measures to mitigate risks.
- Incident Response: Collaborate with incident response teams to investigate and respond to security incidents related to data platforms. Develop incident response plans and participate in incident response exercises.
- Security Audits and Assessments: Participate in security audits and assessments to evaluate the effectiveness of security controls and identify areas for improvement. Address findings and implement necessary changes. Work with internal and external auditors to provide evidence required for audit and compliance.
- Security Awareness and Training: Develop and deliver security awareness and training programs to educate employees on Azure cloud data platform security best practices.
- Documentation and Reporting: Maintain accurate documentation of security controls, policies, and procedures. Generate reports on security metrics, compliance status, and incidents for management and stakeholders.
- Security Strategy: Contribute to the development and execution of the organization's data platform security strategy. Stay updated on emerging threats and security technologies to recommend improvements.
- Regular Security Audits: Conduct regular security audits and participate in SOX compliance audits, providing reports and recommendations for enhancements.
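As a rough illustration of the Databricks admin side of Position 1 (workspaces, Unity Catalog, CI/CD), here is a minimal Python sketch of the kind of scripted audit such an engineer might run. It is not part of the job description: the workspace host and token environment variable names are invented for this example, and the REST endpoint paths follow Databricks' public Unity Catalog and Clusters APIs, which should be verified against the workspace's API version before use.

```python
import os
import requests

# Hypothetical admin audit sketch: list Unity Catalog catalogs and flag
# clusters without auto-termination in an Azure Databricks workspace.
# Host and token are assumed to come from environment variables; the
# endpoint paths follow the public REST API docs but should be verified.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token or AAD token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def get(path: str) -> dict:
    """Issue a GET against the workspace REST API and return parsed JSON."""
    resp = requests.get(f"{HOST}{path}", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Unity Catalog: enumerate catalogs visible to the admin identity.
    for catalog in get("/api/2.1/unity-catalog/catalogs").get("catalogs", []):
        print("catalog:", catalog.get("name"), "owner:", catalog.get("owner"))

    # Workspace clusters: report anything without auto-termination configured.
    for cluster in get("/api/2.0/clusters/list").get("clusters", []):
        if not cluster.get("autotermination_minutes"):
            print("no auto-termination:", cluster.get("cluster_name"))
```

In practice a script like this would typically run from the CI/CD pipeline mentioned in the JD, so that workspace configuration drift is reported on every deployment.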
-
Hi Everyone, one of our clients is looking for a Lead AWS Engineer for an onsite position in Charlotte, NC.
Role: Lead AWS Engineer
Location: Charlotte, NC
Duration: 6-12 Months
Mandatory Skills:
- 5+ years of designing and developing with common AWS services such as EC2, S3, ECS, EKS, ELB, any RDS, streaming (Kafka), messaging (SQS, ActiveMQ, RabbitMQ), caching (Redis, Ignite, Redshift, Elastic), NoSQL, and API Gateway
- 10+ years of proficiency in at least one language (C#, .NET Core) and experience in full-stack development
- 10+ years of experience working with RDBMS technologies such as MSSQL, Oracle, or others
- 5+ years of experience with Docker and Kubernetes
- 3+ years of experience in a lead role: overseeing the development team, delivering with quality, and coaching the team on best practices
Kindly share any relevant profile you may have at Ankit@zemanticsllc.com or reach out to me directly at +1-732-782-5740.
#AWS #EC2 #S3 #ECS #EKS #ELB #RDS #Kafka #SQS #ActiveMQ #RabbitMQ #Redis #Ignite #Redshift #APIGateway #CSharp #DotNetCore #FullStack #MSSQL #Oracle #Docker #Kubernetes #LeadEngineer #CharlotteNC #CloudComputing #DevOps #TechLeadership #SoftwareDevelopment #NoSQL #AWSArchitecture #AWSDevelopment #TeamLeadership #BestPractices #AWSIntegration #AWSManagement #CloudServices #AWSExpert #AWSJobs #AWSCommunity #AWSConsulting #CloudInfrastructure #CloudSolutions #AWSMigration #Containerization #ContinuousIntegration #ContinuousDelivery #AWSNetworking #DataManagement #DatabaseAdministration #SoftwareEngineering #Microservices #Serverless #Lambda #IAM #InfrastructureAsCode #CloudFormation #CloudWatch #ElasticBeanstalk #VPC #Route53 #CloudSecurity #BigData #MachineLearning #AI #IoT #EdgeComputing #MultiCloud #HybridCloud #CostOptimization #HighAvailability #Scalability #PerformanceTuning #Monitoring #Logging #Troubleshooting #Compliance #Automation #AgileDevelopment #CodeReview #ReleaseManagement #QualityAssurance #TechnicalDocumentation #Collaboration #ProblemSolving #Innovation #CloudStrategy #DigitalTransformation #AWSCertified #CloudNative #ContainerOrchestration #InfrastructureAutomation #CloudEngineering #CloudMigration #CloudGovernance #CloudArchitecture #AWSReInvent
-
Hello Connections! Hope you are doing well!
One of our clients (Level 5) is looking for a #GCPDevops / #DataStream engineer.
Role: GCP DevOps/DataStream
Exp: 8-14 Years
Notice Period: Immediate - 30 Days
Location: Pan India
MOH: Permanent
✅ Job Description:
💥 Mandatory Skills: Azure DevOps (CI/CD pipelines), GCP Cloud, Terraform, Docker, Kubernetes, Dataflow, BigQuery, GCP DataStream, GitHub Actions, scripting, ADO, Jenkins, GitHub.
🔅 Key Responsibilities:
✔ Automate GCP Infrastructure: Design and implement Terraform scripts to provision and automate GCP projects, environments, and data engineering services.
✔ Set Up Infrastructure for GCP DataStream: Configure and deploy the infrastructure for GCP DataStream to support real-time data replication and Change Data Capture (CDC) from on-prem databases such as Oracle and DB2 to BigQuery.
✔ Provision and Manage the GCP Data Engineering Stack: Set up and optimize infrastructure for BigQuery, Dataflow, Pub/Sub, Cloud Storage, and other data engineering tools on GCP.
✔ Automate Infrastructure Pipelines with Azure DevOps: Design, develop, and maintain CI/CD pipelines using Azure DevOps to automate the deployment of Terraform scripts and manage infrastructure changes.
✔ Optimize Cloud Infrastructure: Ensure high availability, scalability, and security for cloud resources and data pipelines.
✔ Security and Compliance: Implement best practices for security and compliance across GCP environments, including IAM policies, encryption, and network security.
✔ Collaboration: Work closely with data engineers and DevOps teams to align infrastructure with data pipelines and operational requirements.
🔅 Required Skills and Experience:
✔ 5+ years of experience in GCP infrastructure automation using Terraform to provision cloud resources and environments.
✔ Experience with GCP DataStream: Proven ability to set up and configure GCP DataStream for real-time data replication and CDC from on-prem databases (Oracle, DB2) to BigQuery.
✔ Strong expertise in GCP data engineering services: Hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and related GCP services.
✔ Infrastructure Automation: Proficiency in creating reusable Terraform modules and automating cloud infrastructure with Terraform.
✔ Experience with Azure DevOps: Expertise in configuring and maintaining CI/CD pipelines for infrastructure provisioning using Azure DevOps.
✔ Networking and Security: Strong understanding of GCP networking, IAM, VPCs, firewall configurations, and security best practices.
✔ Familiarity with on-prem databases such as Oracle and DB2.
✔ Scripting and Automation: Proficiency in scripting (e.g., Bash, Python) for cloud automation and troubleshooting.
If you are interested, kindly share your updated CV at siva@burgeonits.net
BURGEON IT SERVICES | Burgeon IT Jobs
Suman Lakkakula, Rajkiran G, Khaja Mansoor Ahmed, Srikanth Vicky
Thanks & Regards
B. Siva Kumar
Technical Recruiter
Burgeon IT Services
Email: siva@burgeonits.net
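As a rough illustration of the DataStream-to-BigQuery responsibility above, here is a hedged Python sketch of a replication freshness check that could run as a pipeline step after the Terraform apply stage. The project, dataset, table, and ingest_ts column names are placeholders invented for this example; only the google-cloud-bigquery client usage is standard.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical CDC freshness check: after DataStream replicates an on-prem
# table into BigQuery, confirm new rows have arrived recently. The project,
# dataset, table, and timestamp column below are illustrative placeholders.
PROJECT = "my-gcp-project"
TABLE = f"{PROJECT}.cdc_replica.orders"
MAX_LAG_MINUTES = 15

def check_replication_lag() -> None:
    client = bigquery.Client(project=PROJECT)
    sql = f"""
        SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(ingest_ts), MINUTE) AS lag_minutes
        FROM `{TABLE}`
    """
    row = next(iter(client.query(sql).result()))
    lag = row.lag_minutes
    if lag is None or lag > MAX_LAG_MINUTES:
        raise RuntimeError(f"CDC lag check failed for {TABLE}: lag={lag} minutes")
    print(f"Replication healthy: {lag} minutes behind source")

if __name__ == "__main__":
    check_replication_lag()
```

Wired into an Azure DevOps pipeline, a failing check like this would stop a release before stale replicas reach downstream consumers.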
-
Hello Connections! Hope you are doing well!
One of our clients (Level 5) is looking for a #GCP #Devops / #DataStream engineer.
Role: GCP DevOps/DataStream
Exp: 8-14 Years
Notice Period: Immediate - 15 Days
Location: Pan India
MOH: Permanent
✅ Job Description:
✨ Mandatory Skills: GCP Cloud, Terraform, Docker, Kubernetes, Dataflow, BigQuery, GCP DataStream, GitHub Actions, any scripting.
🔅 Key Responsibilities:
✔ Automate GCP Infrastructure: Design and implement Terraform scripts to provision and automate GCP projects, environments, and data engineering services.
✔ Set Up Infrastructure for GCP DataStream: Configure and deploy the infrastructure for GCP DataStream to support real-time data replication and Change Data Capture (CDC) from on-prem databases such as Oracle and DB2 to BigQuery.
✔ Provision and Manage the GCP Data Engineering Stack: Set up and optimize infrastructure for BigQuery, Dataflow, Pub/Sub, Cloud Storage, and other data engineering tools on GCP.
✔ Automate Infrastructure Pipelines with Azure DevOps: Design, develop, and maintain CI/CD pipelines using Azure DevOps to automate the deployment of Terraform scripts and manage infrastructure changes.
✔ Optimize Cloud Infrastructure: Ensure high availability, scalability, and security for cloud resources and data pipelines.
✔ Security and Compliance: Implement best practices for security and compliance across GCP environments, including IAM policies, encryption, and network security.
✔ Collaboration: Work closely with data engineers and DevOps teams to align infrastructure with data pipelines and operational requirements.
🔅 Required Skills and Experience:
✔ 5+ years of experience in GCP infrastructure automation using Terraform to provision cloud resources and environments.
✔ Experience with GCP DataStream: Proven ability to set up and configure GCP DataStream for real-time data replication and CDC from on-prem databases (Oracle, DB2) to BigQuery.
✔ Strong expertise in GCP data engineering services: Hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and related GCP services.
✔ Infrastructure Automation: Proficiency in creating reusable Terraform modules and automating cloud infrastructure with Terraform.
✔ Experience with Azure DevOps: Expertise in configuring and maintaining CI/CD pipelines for infrastructure provisioning using Azure DevOps.
✔ Networking and Security: Strong understanding of GCP networking, IAM, VPCs, firewall configurations, and security best practices.
✔ Familiarity with on-prem databases such as Oracle and DB2.
✔ Scripting and Automation: Proficiency in scripting (e.g., Bash, Python) for cloud automation and troubleshooting.
📩 If you are interested, kindly share your updated CV at siva@burgeonits.net
BURGEON IT SERVICES | Burgeon IT Jobs
Suman Lakkakula, Rajkiran G, Khaja Mansoor Ahmed, Srikanth Vicky
Thanks & Regards
B. Siva Kumar
Technical Recruiter
Burgeon IT Services
Email: siva@burgeonits.net
-
Role: AWS Aurora Consultant
Location: Remote - US
Type: Contract
Responsibilities:
1. Analyze the AWS Aurora instances and clusters and decide whether the extents and sizes defined and used for tables and indexes are optimal.
2. Examine long-running queries and check the indexes, joins, and statistics.
3. Scrutinize the current data organization and work with the application teams on storage strategies such as partitioning, segmentation, and split-merge tables.
4. Estimate the instance size needed for optimal application performance, and how the size and configuration should change as the database grows to sustain that performance.
Mandatory Skillsets:
• AWS expertise (AWS Aurora and other database services)
• Postgres expertise
• Performance tuning
• DynamoDB
#benchsales #benchlist #hotlist #hiring #vendorlist #AWS #AWSAurora #DB #database
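To give a feel for the analysis tasks listed above, here is a small hedged Python sketch that pulls the slowest statements and never-scanned indexes from an Aurora PostgreSQL cluster. It assumes the pg_stat_statements extension is enabled and that connection details live in environment variables whose names are invented here for illustration; it is a starting point for the review, not the consultant's full methodology.

```python
import os
import psycopg2  # pip install psycopg2-binary

# Hypothetical first pass at the Aurora analysis: surface the slowest queries
# and indexes that have never been scanned. Assumes an Aurora PostgreSQL
# cluster with pg_stat_statements enabled; env var names are placeholders.
conn = psycopg2.connect(
    host=os.environ["AURORA_HOST"],
    dbname=os.environ.get("AURORA_DB", "postgres"),
    user=os.environ["AURORA_USER"],
    password=os.environ["AURORA_PASSWORD"],
)

with conn, conn.cursor() as cur:
    # Slowest statements by mean execution time (mean_exec_time applies to
    # PostgreSQL 13+; older versions name the column differently).
    cur.execute("""
        SELECT query, calls, mean_exec_time
        FROM pg_stat_statements
        ORDER BY mean_exec_time DESC
        LIMIT 10
    """)
    for query, calls, mean_ms in cur.fetchall():
        print(f"{mean_ms:10.1f} ms  {calls:8d} calls  {query[:80]}")

    # Indexes never used since the last stats reset -- candidates to review
    # with the application teams before any storage reorganization.
    cur.execute("""
        SELECT relname, indexrelname
        FROM pg_stat_user_indexes
        WHERE idx_scan = 0
    """)
    for table, index in cur.fetchall():
        print(f"unused index: {index} on {table}")

conn.close()
```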
-
🚀 We're Hiring: AWS Neptune Developer 🚀
💼 Experience: 3-5 years
📍 Location: Remote
💰 Budget: 25 LPA
Notice period: Immediate joiners only
Responsibilities:
- Design and implement graph database solutions on AWS Neptune to meet application needs.
- Develop and optimize Gremlin and SPARQL queries for efficient data retrieval and manipulation.
- Handle data migration from on-premises systems (e.g., Neo4j) to AWS Neptune.
- Collaborate with cloud architects to design scalable and secure AWS cloud solutions.
- Develop data transformation utilities for property graph and RDF data formats.
- Implement security controls for access management, user roles, and encryption on AWS Neptune.
- Monitor and troubleshoot AWS Neptune clusters and associated services.
- Enhance existing applications to utilize Neptune's graph capabilities effectively.
- Work with cloud-native tools like AWS Lambda, S3, and SQS for integrations and workflows.
- Prepare documentation for architecture, development, and operational processes.
Qualifications:
- Strong understanding of AWS Neptune architecture and graph database principles.
- Proficiency in the Gremlin and SPARQL query languages.
- Hands-on experience with data migration tools and cloud services such as AWS DMS.
- Familiarity with AWS services like Lambda, EC2, IAM, and CloudFormation.
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
- Knowledge of database optimization, indexing, and performance monitoring.
- Strong understanding of RESTful APIs and event-driven frameworks.
- Excellent communication and collaboration skills.
👉 Interested?
📞 Contact: 9581834154
💌 Email: chandrika@axzorait.com
#AWSNeptune #GraphDatabase #Gremlin #SPARQL #CloudSolutions #DataMigration #Neo4j #AWSLambda #AWSS3 #AWSSQS #AWSCloud #CloudArchitecture #DataTransformation #GraphQueries #RDF #PropertyGraph #AWSCloudServices #Containerization #Docker #Kubernetes #AWSCloudFormation #EC2 #IAM #APIIntegration #EventDriven #PerformanceOptimization #DatabaseDesign #CloudIntegration #AccessManagement #DataSecurity #AWSNeptuneMonitoring #CloudDevelopment #CloudCollaboration #TechDocumentation #GraphCapabilities
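To ground the Gremlin portion of this role, here is a minimal hedged sketch using the gremlinpython driver. The cluster endpoint, vertex label, and properties are placeholders, and IAM database authentication (SigV4 request signing) is omitted, so treat it as a connectivity smoke test rather than a production client.

```python
# pip install gremlinpython
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.anonymous_traversal import traversal

# Hypothetical Gremlin sketch for an AWS Neptune cluster: connect to the
# websocket endpoint and run a small property-graph traversal. The endpoint
# and the "person" label are placeholders; clusters that enforce IAM auth
# additionally require SigV4-signed requests, not shown here.
NEPTUNE_ENDPOINT = "wss://my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/gremlin"

def main() -> None:
    connection = DriverRemoteConnection(NEPTUNE_ENDPOINT, "g")
    g = traversal().withRemote(connection)
    try:
        # Count vertices by label -- a quick sanity check after, say, a
        # Neo4j-to-Neptune migration.
        print(g.V().label().groupCount().toList())

        # Fetch a few vertices with their properties; adapt the label to
        # whatever the application's graph model actually uses.
        for vertex in g.V().hasLabel("person").limit(5).valueMap(True).toList():
            print(vertex)
    finally:
        connection.close()

if __name__ == "__main__":
    main()
```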
-
Hello Network, please check my updated hotlist and reach me at praveen@crestsoftwareinc.com; Direct: 201-844-6557
Skill sets available: #java #dataengineer #frontend #UI #devops #azure #cloud #sccm #salesforcedeveloper #businessanalyst #c2c #hotlist