Hi all, I'm Mayur 👋

Software Engineer with experience in event-driven systems. Proficient in operating multi-region distributed streaming platforms such as Apache Kafka, both on AWS and on-premise, and in provisioning and operating data ETL systems built on Airflow, DBT, Kafka, and Snowflake in public clouds such as AWS. Extensive end-to-end software development life-cycle experience, backed by strong communication skills and a constructive attitude.

Skills

Programming

  • Java

  • Python

  • Bash

  • JavaScript

  • TypeScript

  • C/C++

  • VHDL

  • Verilog

Big Data

  • Kafka

  • Kafka Connect

  • Airflow

  • SQL

  • Snowflake

  • DBT

  • JSON

  • Avro

DevOps Tools

  • Terraform

  • Ansible

  • SaltStack

  • GitHub Actions

  • Docker

  • Bamboo

  • Maven

  • GitHub

  • Bitbucket

  • SonarQube

AWS Cloud Services

  • EC2

  • S3

  • Secrets Manager

  • Lambda

  • Managed Airflow

  • SageMaker

  • IAM

  • CloudFormation

  • SQS

  • SNS

  • CloudWatch

Cloud Providers

  • AWS

  • GCP

Web

  • HTML5

  • CSS

  • JavaScript

  • React

  • Angular

  • Bootstrap

  • TypeScript

⚡ Kafka Admin

⚡ Cloud Architect

⚡ DevOps

Experiences

Tucows
Staff Engineer
May 2022 – Present

    Tucows
    Data Integration Developer
    July 2021 – May 2022

    • Design, build, secure, and maintain a data ETL platform with DBT, Airflow, Kafka, and Snowflake.
    • Build automation to provision, maintain, and monitor all aspects of the Kafka, Airflow, and AWS SageMaker environments via infrastructure as code.
    • Migrated the on-premise ETL job orchestration system (Airflow) to AWS Managed Airflow.
    • Administered the data lake (Snowflake) using Terraform.
    • Deployed Python AWS Lambdas to generate and deliver usage reports.
    • Implemented CI/CD for machine-learning pipelines using Terraform and AWS SageMaker.
    • Implemented CI/CD pipelines for Kafka and Airflow using Terraform, SaltStack, GitHub Actions, etc.
    • Conceived and wrote detailed software implementations while ensuring that the code adheres to security, logging, error-handling, and performance standards and non-functional requirements.
    • Skills: Application Development (Python, Java, Kafka Streams, Kafka Connect, JUnit), AWS (IAM, S3, Lambda, Managed Airflow, SageMaker, SQS, SNS, CloudWatch, CloudFormation), Other (Terraform, Docker, SaltStack, Airflow, DBT, Snowflake, GitHub Actions, Jira, Confluence, Asana).
    BMO Financial Group
    Software Developer
    January 2020 – July 2021

    • Designed, built, secured, and maintained multi-data-center Kafka infrastructure.
    • Built automation to provision, maintain, and monitor all aspects of the Kafka environment via infrastructure as code.
    • Converted traditional Kafka clusters to multi-region clusters to meet a stringent Recovery Point Objective (RPO).
    • Helped build real-time streaming data pipelines to reliably move data between systems and applications.
    • Identified bottlenecks and tuned Kafka to optimize performance and throughput.
    • Built real-time streaming applications with Kafka Streams that transform and react to streams of data.
    • Conceived and wrote detailed software implementations while ensuring that the code adhered to security, logging, error-handling, and performance standards and non-functional requirements.
    • Automated client onboarding to Kafka by developing CI/CD pipelines for Kafka Streams applications, topics, ACLs, schemas, Kafka connectors, etc.
    • Skills: Application Development (Java, Kafka Streams, Kafka Connect, JUnit), Other (JSON, Avro, WSDL, HTML, CSS3, JavaScript, Angular, Git, Ansible, Bamboo, Jira, Confluence).
    TD Bank
    Software Developer
    March 2018 – June 2019

    • Designed and developed Spring Batch jobs in Java, using Maven for dependency management and builds.
    • Developed SQL stored procedures for data-intensive logic, and shell scripts for file manipulation, job dependencies, etc.
    • Demonstrated excellent time management on deadline-driven projects and tasks.
    • Skills: Application Development (Java, Spring Batch, JDBC, Hibernate, JUnit), Scripting (Bash, Python), Other (Jenkins, Bitbucket, SonarQube, Confluence, Jira, HTML/CSS).

    Education

    Conestoga College
    Postgraduate Diploma in Embedded Systems Development

    2016 - 2017

    Dean's Honors List Recognition

    • Relevant Courses: Operating Systems, Data Structures and Algorithms, Embedded Programming (C/C++)
    Government Engineering College, Gandhinagar
    Bachelor's in Electronics & Communication Engineering

    2012 - 2015

    Ranked in the top 10% of the program. Coursework included Software Engineering, Web Security, Operating Systems, ...

    • Relevant Courses: C/C++ Programming, Verilog, VHDL, Communication Skills
    N. G. Patel Polytechnic
    Diploma in Electronics & Communication Engineering

    2009 - 2012

    Silver Medal (Recognition for Academic Achievement)

    • Relevant Courses: C/C++ Programming, Verilog, VHDL, Communication Skills

    Mayur Hadole

    Software Engineer

    Toronto, Canada

    Reach out to me!
