
Pitney Bowes Advisory Data Engineer in Sector 62, Noida, India

At Pitney Bowes, we do the right thing, the right way. As a member of our team, you can too.

We have amazing people who are the driving force, the inspiration and foundation of our company. Our thriving culture can be broken down into four components: Client. Team. Win. Innovate.

We actively look for candidates who:

• Are passionate about client success.

• Enjoy collaborating with others.

• Strive to exceed expectations.

• Move boldly in the quest for superior, best-in-market solutions.

Job Description:

Join Pitney Bowes as Advisory Data Engineer

Years of experience: 6-8 years

Job Location – Noida


As an Advisory Data Engineer, you will be part of Parcel Tracking Services, one of the core domain services at Pitney Bowes, which maintains mission-critical enterprise applications. You will build logical and physical data models, and create and maintain the Snowflake warehouse in collaboration with database engineers. You will conduct modeling sessions with the project team and gather and define data requirements for the Enterprise Data Model. You will also develop and regularly retrain machine learning models for various tracking-related scenarios. You are expected to be hands-on and up to date with the latest technology stack.

The Job

  • Devise and utilize algorithms and models to mine big data stores, perform data and error analysis to improve models, and clean and validate data for uniformity and accuracy

  • Analyze data for trends and patterns, interpret results to drive product optimization and improvement, and propose solutions and strategies to business challenges

  • Use data visualization techniques to communicate analytic solutions to stakeholders

  • Select features and build and optimize classifiers using machine learning techniques

  • Implement analytical models into production by collaborating with software developers

  • Develop processes and tools to monitor and analyze model performance and data accuracy.

  • Conduct modeling sessions with the project team; gather and define data requirements for the Enterprise Data Model

  • Lead discussions with stakeholders to transform data requirements into a robust, canonical logical model and data dictionary, with clearly written business definitions for logical entities and attributes that describe enterprise data in a way that is consumable by a wide business audience

  • Collaborate with database engineers to create logical and physical data models and to create and maintain the Snowflake warehouse

  • Provide and enforce best practices and governance standards, including normalization and data classification, in the delivery of high-quality, business-group-focused data solutions for the Enterprise Data Warehouse

  • Apply hands-on experience with batch processing and streaming data/analytics

  • Work in an agile model with changing needs and priorities

  • Demonstrate ownership and accountability

  • Work both in a team and individually to troubleshoot and investigate problems

Required Qualifications & Skills

This role requires a talented self-directed individual with a strong work ethic and the following skills:


  • Experience with Cloud BI Platform – Snowflake

  • Experience using data visualization tools - Power BI, Tableau, DataApps.

  • Experience in writing ad-hoc SQL queries (Microsoft SQL, SnowSQL)

  • Experience in creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.

  • Knowledge and experience in statistical and data mining techniques: GLM/regression, random forest, boosting, etc.

  • Experience using Jupyter notebooks for data visualization and machine learning model development in Python

  • Advanced Excel skills, including functions and pivot tables

  • Experience with cloud platforms (AWS)

  • Experience with data quality check tools like Monte Carlo

Good to have

  • Experience with event streaming and monitoring tools (Kafka, Prometheus)

  • Knowledge of CI/CD pipeline architecture for training and deploying machine learning models

  • Knowledge of Kubernetes and Docker would be an added advantage

  • Experience with agile development methodologies, including Scrum

  • Knowledge of Azure DevOps would be an added advantage

  • Familiarity with Atlassian toolset (JIRA, Confluence) a plus

Qualifications & Work Experience

Bachelor's or Master's Degree in Statistics, Mathematics, Computer Science, Management Information Systems, Data Analytics, or another quantitative discipline, or equivalent work experience

We will:

• Provide the opportunity to grow and develop your career

• Offer an inclusive environment that encourages diverse perspectives and ideas

• Deliver challenging and unique opportunities to contribute to the success of a transforming organization

• Offer comprehensive benefits globally (pbprojectliving.com)


All interested individuals must apply online.

Pitney Bowes is an Equal Employment Opportunity/Affirmative Action Employer that values diversity and inclusiveness in the workplace.

Women/Men/Veterans/Individuals with Disabilities/LGBTQ are encouraged to apply.

Individuals with disabilities who cannot apply via our online application should refer to the alternate application options via our Individuals with Disabilities link.