Kshitija Gupte



About Kshitija Gupte:

An accomplished Senior Data Scientist with 16 years of experience in the data science industry. Possesses extensive domain expertise in the B2B space, supply chains, bank alliances, fintech products, healthcare, insurance, and finance practices, as well as technologies such as IBM mainframes, DB2, COBOL, JCL, HBR, and TBR. Has worked with top-tier companies like Infosys and PricewaterhouseCoopers, with expertise in GCP and Azure spanning both the database layer and the front end of analytics platforms. Proficient with tools such as SQL, Tableau, Domo, Postgres, AWS, Zeppelin, Python, and R.

Experience

KJ Gupte 

 

Professional Summary:

  • Senior Data Scientist at Tradeshift Inc. with expertise in data analytics and visualization. Demonstrated ability to create informative dashboards and reporting tools on big data platforms like Tableau, QlikView, and DOMO, with a results-driven focus and a big-picture approach.

 

  • Hands-on experience in data management and data analysis across relational databases, Hadoop, AWS, Postgres, GCP, Azure, and Spark SQL.
  • Experience delivering large-scale data science projects end to end.
  • Experience building data pipelines, model artifacts, and packaging for model deployment.
  • Strong communication and interpersonal influencing skills.
  • Excellent problem-solving and critical-thinking capabilities.

 

  • Core Competencies include Project Management, Data Analysis, Data Modelling, Product Management, Content Management, Agile Methodologies, Business Analysis, User Interface Design, Process Mapping, and Data Manipulation.

 

  • Technical Expertise - Databases: DB2, MSSQL, Oracle, MySQL, MS Access, MS Excel (Advanced); Data Analysis

 

  • Visualization Tools: Tableau, DOMO, QlikView, Pentaho, R (beginner), Python (beginner)
  • Practices: Integration Fabric, DevOps, Cloud Computing - GCP, Azure, etc.
  • Other Tools: Hive, Hadoop, Vertica, JIRA, Rational ClearCase, ClearQuest, UXPin, MindMeister, Balsamiq, Visio, CaseWise, AWS, Zeppelin, Jupyter
  • Languages/Operating Systems: Mainframes, Unix, R (beginner), Python (beginner), Machine Learning (beginner)

 

Certifications:

  • AHIP Fundamentals of Healthcare B
  • AHIP Fundamentals of Healthcare A 
  • Relational Databases & Advanced SQL Programming
  • Advanced Mainframes Programming

 

Education:

  • Bachelor of Engineering - Mumbai University
  • Harvard Business Analytics Program - Harvard Business School

 

 

Professional Experience:

 

Tradeshift Inc, San Francisco, CA                                                            Nov 2016 – Present

Role: Sr. Data Scientist

 

Building and scaling Data Science for a late-stage startup (which just acquired unicorn status) to enhance its platform data capabilities for market research and predictive analytics on its buyer-supplier network. Brings extensive domain expertise in the B2B space, especially in trade networks, supply chains, fintech engagements, and bank partnerships.

 

  • Developed a Network Growth Forecast model that shows real-time network growth based on buyers and sellers on the platform; calculated platform growth through GMV and transactional data analysis (see the sketch after this list)
  • Behavioral Analytics - Built data models from the platform data that can be used to analyze the behavior of buyers and sellers
  • Solution Architecture - Built real-time data models for various use cases to support internal analytics projects that monetize and grow the customer base based on platform behavior, as well as to build new products that support the existing buyer-seller transactions
  • Real-time Analytics through Tableau - Built live and custom dashboards on Tableau that support various engineering, product, and support teams based on their use cases
  • Project Management - Building and growing a Data Science team by coordinating with all internal Engineering and non-Engineering teams, streamlining data usage, and conveying data gaps and findings to all internal stakeholders
  • Strategy and Operations - Communicate data findings directly to the CEO, VP of Operations and Strategy, and Product Managers to help make important decisions, build a strategy around data, and enhance the platform and products
  • Utilize domain expertise in the B2B space, supply chains, bank alliances, and fintech products to develop data-driven solutions for clients.
  • Utilize Zeppelin notebooks to perform exploratory data analysis and develop models.
  • Build and manage databases using Postgres and AWS to store and process large volumes of data.
  • Collaborate with cross-functional teams to ensure successful project delivery.
  • Use Spark SQL, Scala, Tableau, and Domo to perform data analysis and visualization.
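The first bullet describes computing platform growth from GMV and transactional data. Below is a minimal, hedged sketch of that kind of calculation as it might look in a Zeppelin-style PySpark notebook; the transactions table and its columns (buyer_id, seller_id, amount_usd, created_at) are illustrative assumptions, not the actual Tradeshift schema.

# Hedged sketch (assumed schema): monthly GMV and network activity from a
# hypothetical `transactions` table with buyer_id, seller_id, amount_usd, created_at.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("network-growth-sketch").getOrCreate()
tx = spark.table("transactions")  # illustrative platform transaction table

monthly = (
    tx.groupBy(F.date_trunc("month", "created_at").alias("month"))
      .agg(
          F.sum("amount_usd").alias("gmv"),
          F.countDistinct("buyer_id").alias("active_buyers"),
          F.countDistinct("seller_id").alias("active_sellers"),
      )
)

# Month-over-month GMV growth as a percentage.
w = Window.orderBy("month")
growth = monthly.withColumn(
    "gmv_growth_pct",
    (F.col("gmv") - F.lag("gmv").over(w)) / F.lag("gmv").over(w) * 100,
)
growth.orderBy("month").show()

In practice, aggregates of this kind would be the sort of output that feeds the live Tableau dashboards mentioned above.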

 

 

Environment/Tech Stack: AWS (S3), Zeppelin, Postgres, Scala, Spark SQL, Tableau, Domo, SFDC

Deliverables: A/B Experimentation output/documentation, ROI Cost Analysis, Due diligence, White Paper, Case Study building on specific projects, Process documentation, Program development/management, Hypothesis Presentation, Triage Meetings

 

 

 

 

PricewaterhouseCoopers LLC, San Francisco, CA                                Sep 2012 – Nov 2016

Role: Manager, Data Analytics

 

Supported a large-scale enterprise by leveraging crowd-sourcing for Content Management, Data Analysis, and Reporting. Industries: Information Management, Financial Services, Healthcare, Media and Entertainment. PwC projects: Integrated Design Studio, Rainfall Model, Integration Fabric, the PwC Digital Integrated Platform (in collaboration with the PwC Experience Center), and Technology Point Partner Reporting. Extensive domain expertise in cloud technology, database management, technology partnerships, audit and governance practices, litigation-related data analysis, etc.

 

 

  • Product Management, Information Management, Data Visualization, and Content Management through Tableau, DOMO, and QlikView, using agile methodologies to create the Information Design Studio (IDS), the in-house data analytics platform for sharing KPIs, data analysis, and visualization information within PwC
  • Project Management and data architecture for a DevOps platform on the cloud with an Integration Fabric layer (MuleSoft) designed to integrate and bring together all the certified and on-premise digital services and applications
  • Worked with teams across lines of service to build financial reports from Revenue, HR, and Sales data to drive decisions related to client staffing, opportunity pipeline, and revenue
  • Data analysis for a channel audit and risk assessment of 3rd-party channel partners, deals pricing, sales operations, etc., following a set of complex use cases on data flow and usage
  • Risk assessment to mitigate audit-related risk and misuse of software assets; business process analysis and mapping around a software license compliance assessment and framework
  • Disseminated and collected web-based survey data and conducted data analysis to provide strategic recommendations for risk assessment projects
  • Software Asset Management and assessment to provide software license reviews and an ITIL process mapping and improvement strategy
  • Worked on various data and analytics projects, mostly in the tech practice.
  • Developed solutions using GCP and Azure.
  • Used Tableau and Domo to perform data analysis and visualization.
  • Collaborated with cross-functional teams to ensure successful project delivery.

 

Environment: SQL, GCP, Azure, Tableau, Domo, Python, R, Unix, Pentaho, Balsamiq, Power BI, Looker, QlikView; opportunity data analysis and visualization through Domo and Tableau; data pipeline deployment in SQL/GCP and Azure

Deliverables: Due Diligence, Process documentation, Data Analysis in SQL

 

Infosys, Wells Fargo, St. Louis                                                               May 2009 – Aug 2012

Role: Quality Assurance Manager

 

  • Designed an automated testing model to test the financial data flowing into primary sources used by Business and Financial Analysts. Used this model to test data flowing from 3rd-party vendors involved in the Wachovia and Wells Fargo merger (a hedged sketch of this kind of reconciliation check follows the list below).
  • Data cleaning and data mining for testing in lower environments and PROD environments
  • Worked in an Agile environment, with code delivered in sprints
  • Ran daily status reports based on test case completion status and defects.
  • Performed E2E testing and executed regression test cases based on release or launch schedules.
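As an illustration of the automated data-testing idea above, here is a minimal sketch of a source-to-target reconciliation check in Python/pandas; the function, columns, and tolerance are illustrative assumptions, not the actual Wells Fargo test harness (which ran against Unix/DB2).

# Hedged sketch: reconcile a vendor feed against the target table on row
# counts and amount totals. Columns and tolerance are illustrative assumptions.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, amount_col: str) -> dict:
    """Return simple pass/fail checks on row count and amount totals."""
    checks = {
        "row_count_match": len(source) == len(target),
        "amount_total_match": abs(source[amount_col].sum() - target[amount_col].sum()) < 0.01,
    }
    checks["passed"] = all(checks.values())
    return checks

if __name__ == "__main__":
    # Tiny illustrative feed: a vendor extract vs. what landed in the warehouse.
    source = pd.DataFrame({"acct": [1, 2, 3], "amount": [100.00, 250.50, 75.25]})
    target = pd.DataFrame({"acct": [1, 2, 3], "amount": [100.00, 250.50, 75.25]})
    print(reconcile(source, target, "amount"))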

 

 

Environment: Unix/DB2

Deliverables: Business Analysis, PMO, Estimation and Forecast Reporting, Test Strategy Consulting

 

Infosys, Wachovia, Chicago                                                                    May 2008 – May 2009

Role: Technical Test Lead

 

  • Designed an estimation and forecasting data model to report and analyze results as part of the Wachovia and Wells Fargo merger. Scalability testing expert for the Data Management team handling this project.
  • Developed test plans and reviewed them with the PM, PMO, and business users for each phase, as the migration happened in phases
  • Wrote and executed test cases and logged defects

 

Environment: Unix/DB2

Deliverables: DB2; BI Tools – Autosys, Informatica; Data Analysis, Data Management and Performance Reporting; Business Analysis, Requirements Gathering, Technical Writing; Module Lead

 

 

Infosys, Aetna, Philadelphia                                                               Oct 2007 – May 2008

Role: Business Analyst

 

  • Subject matter expert for the backend data model used for Aetna's mainframes. Business analyst to design front-end systems supporting this model primarily used by Aetna's Medicare Analysts and Business Users. 
  • Provided consultative expertise to Financial and Healthcare Organizations in support of Business Users and Engineering teams.

 

Environment: IBM Mainframes, COBOL, JCL, DB2

Deliverables: DB2/SQL, Technical Business Requirements, Functional Business Requirements, Use Case Scenarios, Persona building, User requirement gathering sessions, ClearCase/ClearQuest documentation

 

 

Infosys, Pune                                                               Oct 2004 – Sep 2007

Role: Software Programmer

  • Designed and developed Aetna's back-end and front-end Medicare systems on IBM Mainframes.

 

Environment: 

  • COBOL, JCL, DB2, IBM Mainframes; Developer and Test Analyst

 

 

 

 

 

 

 
