8th Floor, AP 81, Sr. No. 83, North Main Road, Near Hard Rock Café, Mundhwa, Pune


GCP Engineer

Job description

Company Description

Metro Global Solution Center (MGSC) is the internal solution partner for METRO, a €29.8 billion international wholesaler with operations in 31 countries through 661 stores and a team of 93,000 people globally. METRO also operates its Food Service Distribution (FSD) business in a further 10 countries and is thus active in a total of 34 countries.

MGSC has locations in Pune (India), Düsseldorf (Germany) and Szczecin (Poland). We provide IT and business operations support to 31 countries, speak 24+ languages and process over 18,000 transactions a day. We are setting tomorrow's standards for customer focus, digital solutions, and sustainable business models. For over 10 years, we have been providing services and solutions from our two locations in Pune and Szczecin. This has allowed us to gain extensive experience in how we can best serve our internal customers with high quality and passion. We believe that we can add value, drive efficiency, and satisfy our customers.

Website: https://www.metro-gsc.in
Company Size: 600-650
Headquarters: Pune, Maharashtra, India
Type: Privately Held
Inception: 2011


Job Description

As a GCP Developer, you are a full-stack GCP engineer who loves solving business problems with GCP services. You work with business leads, analysts and data scientists to understand the business domain, and engage with fellow engineers to build innovative data products for the business. You are passionate about the data quality of our business metrics and the flexibility of your solutions, which scale to answer broader business questions.


Qualifications

Must have Skills:

  • Strong knowledge of GCP services such as BigQuery, Cloud SQL, and Big Data tooling.
  • Ability to understand business problems and translate them into data services and engineering outcomes.
  • Knowledge of data modeling and SQL concepts to manage and manipulate data.
  • Collaboration with engineering and business teams to build better data products.
  • Hands-on experience with the Python programming language.
  • Consistent drive to acquire new skills in Cloud, DevOps, Big Data, AI and ML.
  • Proficiency with and understanding of Vertex AI (UI & SDK).
  • Knowledge of ETL and of building data pipelines using Google APIs.
  • Adherence to standard Git practices and the ability to implement CI/CD pipelines.

Good to have Skills:

  • NLP skills covering text processing and encoders, with experience in building and optimizing search engines.
  • Ability to optimize existing models.
  • Skill in prompt engineering and LLMs.
  • GCP Professional Data Engineer certification is a plus.

Tech Skills:

  • Python programming.
  • SQL skills and BigQuery, with data warehousing experience.
  • Strong knowledge of GCP.
  • Familiarity with IDEs such as VS Code and PyCharm.

Additional Information

  • Good communication skills to express his/her thoughts.
  • Experience working independently, and strong analytical skills.
  • Desire to learn and work with new technologies.
  • Commitment and responsibility towards the work.
  • Ability to provide technical expertise and guidance to junior team members.