About Aumni Techworks
Established in 2016, Aumni Techworks partners with its multinational clients to incubate and operate remote teams in India using the AumniBOT model. With a team of 250 and growing, we aim to provide a quality alternative to project-based outsourcing.
Benefits of working at Aumni Techworks:
● Work within a product team on cutting-edge tech with one of the best pay packages.
● No politics, no bench, voice your opinion, flat hierarchy, and global exposure.
● A work environment that lets you relive your fun college days (awarded Best Culture by Pune Mirror).
● Recharge frequently with Friday socials, dance classes, theme parties, and a monsoon picnic.
● Breakout spaces at the office – gym, pool, table tennis, foosball, and carrom.
● Health focused – insurance coverage, plus AumniFit to help you get in shape (don't miss our 4 PM plank!).
Job Description:
As a Data Engineer on the data team, you will work directly with the Software Engineering and Product teams to continuously improve our data infrastructure, design, tools, and pipelines. Your work will directly influence and drive organizational insights, customer-facing features, and machine learning models.
Roles and Responsibilities:
● Design and implement the architecture of next-generation data pipelines, analytics, and BI solutions
● Manage AWS resources including RDS, EMR, MWAA, Lambda, etc.
● Build and deliver high-quality data architecture and pipelines to support business analysts and data scientists
● Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
● Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
Requirements:
● 3-5 years of data engineering experience
● Experience with data modeling, warehousing, and building ETL pipelines
● Experience programming with at least one modern language, preferably Python
● Experience working on and delivering end to end projects independently
● Knowledge of distributed systems as they pertain to data storage and computing
● 2+ years of experience analyzing and interpreting data with Postgres, NoSQL databases, etc.
● 2+ years of hands-on experience with big data frameworks and storage mechanisms such as Apache Spark, EMR, Glue, and Data Lake, and with BI tools like Tableau
● Experience with geospatial and time-series data
Desired Skills:
● Collaborate with Software Engineers, Product Managers, Data Scientists, and Business Intelligence Engineers to design, plan, and deliver on high-priority data initiatives serving internal stakeholders.
● Build automated, fault-tolerant, and scalable data solutions leveraging state-of-the-art technologies including but not limited to Spark, EMR, Python, Airflow, Glue, and S3.
● Look around corners and be creative: continuously evaluate and improve our strategy, architecture, tooling, and codebase to maximize performance, scalability, and availability.