Job Details
Senior Software Engineer
Job Code Number
212530877
Job Description
V2Soft (www.v2soft.com) is a global company headquartered in Bloomfield Hills, Michigan, with locations in Mexico, Italy, India, China and Germany. At V2Soft, our mission is to provide high-performance technology solutions that solve real business problems. We become our customer's true partner, enabling both parties to enjoy success. We are committed to promoting diversity in the workplace, and believe it has a positive effect on our company and the customers we serve.
Key Responsibilities:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the Data Platform
- Implement methods to automate all parts of the pipeline, minimizing labor in development and production
- Identify, develop, evaluate, and summarize Proof of Concepts to prove out solutions
- Test and compare competing solutions and report out a point of view on the best solution
- Apply experience with large-scale solutioning and operationalization of data warehouses, data lakes and analytics platforms on GCP
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- Migrate existing Big Data pipelines into Google Cloud Platform; build new data products in GCP
Skills Required:
- Minimum 3 years of in-depth experience in Java/Python
- Minimum 2 years of experience in data engineering pipelines and building data warehouse systems, with the ability to understand ETL principles and write complex SQL queries
- Minimum 5 years of experience working in GCP-based Big Data deployments (batch/real-time), leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc
- Minimum 2 years of development experience in data warehousing and the Big Data ecosystem: Hive (HQL) and Oozie Scheduler, plus ETL tools (IBM DataStage, Informatica IICS) with Teradata
- 1 year of experience deploying Google Cloud services using Terraform
Skills Preferred:
- Understands the cloud as a way to operate, not merely a place to host systems
- Understands data architectures and design independent of the technology
- Experience with Python and shell scripting preferred
- Exceptional problem-solving and communication skills, with the ability to manage multiple stakeholders
- Experience working with Agile and Lean methodologies
- Experience with Test-Driven Development
Education Required:
- Bachelor's or Master's degree in a relevant field
V2Soft is an Equal Opportunity Employer (EOE).
Visit https://www.v2soft.com/careers to view all of our open opportunities and to learn more about our benefits.