Azure Cloud Data Engineer - Vienna, VA at Geebo

Azure Cloud Data Engineer

Location: Currently REMOTE but MUST be local to Vienna, VA
Pay Rate: Open to both C2C and W2 options
Position Type: Multiyear contract

Basic Purpose:
Develop technical solutions for data acquisition, data integration, and data sharing with Client's Omni-Channel digital platforms in near real-time and batch.
Responsible for engineering, designing, building, and integrating data from various batch, streaming, and edge applications into high-performing operational hubs.
Develop complex event-driven applications with the goal of optimizing the performance of Client's messaging/event-driven data ecosystem.
Recognized as an expert with a specialized depth and/or breadth of expertise in the discipline.
Solves highly complex problems; takes a broad perspective to identify solutions.
Leads functional teams or projects.
Works independently.
Responsibilities:
  • Design and build highly scalable data pipelines for near real-time and batch data ingestion, processing, and integration
  • Provide technical leadership and guidance; educate team members and coworkers on the development and operation of streaming and event-driven applications
  • Serve as a hands-on mentor and advocate to ensure successful adoption of new tools, processes, and best practices across the organization
  • Recognize potential issues and risks during project implementation and suggest mitigation strategies
  • Communicate and own the process of manipulating and merging large datasets
  • Act as the expert and key point of contact between the API teams and the project/functional leads
  • Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives
  • Prepare advanced project implementation plans that highlight major milestones and deliverables, leveraging standard methods and work-planning tools
  • Lead the preparation of high-quality project deliverables that are valued by the business, and present them so that they are easily understood by project stakeholders
  • Perform other duties as assigned

Qualifications and Education Requirements:
  • Degree in Information Systems, Computer Science, Engineering, or a related field, or the equivalent combination of education, training, and experience
  • Working knowledge of message-oriented middleware/streaming data technologies such as Kafka/NiFi, MQ, and Azure Event Hubs
  • Strong programming skills and experience in C#/.NET and Logic Apps
  • Strong programming skills and experience in Azure Functions using various protocols/triggers, plus Git/GitHub
  • Hands-on experience configuring Azure Event Hubs, Event Grid, Stream Analytics, Logic/Function Apps, and JSON
  • Expert-level skills in Python, Databricks, and Azure Data Factory
  • Experience with ETL tools and techniques, and knowledge of CI/CD
  • Experience and expertise in cloud NoSQL databases, ideally Azure/Azure Data Services/Cosmos DB or equivalent
  • Knowledge of and experience scaling one or more popular data streaming and processing technologies such as Kafka, Spark, and Spark Streaming
  • General knowledge of and experience with configuration, load balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment
  • Demonstrated change-management and excellent communication skills

For immediate consideration, please apply directly or contact Ryan Pustilnik at 301.740.2110.
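To illustrate the kind of event-driven JSON processing the role describes, here is a minimal sketch in plain Python (no Azure SDK) of parsing and routing an event payload of the sort an Event Hub-triggered function might receive. The event schema, field names, and routing rules below are invented for illustration, not taken from any actual Client system.

```python
import json

def handle_event(raw: bytes) -> dict:
    """Parse a JSON event payload and route it by event type.

    The schema (eventType, payload) is hypothetical; real Event Hub
    messages carry whatever schema the producing application defines.
    """
    event = json.loads(raw)
    event_type = event.get("eventType", "unknown")
    if event_type == "order.created":
        # Near real-time path: pick out the fields to forward downstream.
        return {"route": "stream", "order_id": event["payload"]["orderId"]}
    # Anything unrecognized falls back to the batch path.
    return {"route": "batch", "event_type": event_type}

# Example usage with a fabricated event
msg = json.dumps({"eventType": "order.created",
                  "payload": {"orderId": 42}}).encode()
print(handle_event(msg))  # → {'route': 'stream', 'order_id': 42}
```

In a real deployment, the routing decision would typically hand off to Stream Analytics, a downstream Event Hub, or a Data Factory pipeline rather than return a dict.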
Estimated Salary: $20 to $28 per hour based on qualifications.
