Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth.
Impetus designs and develops state-of-the-art products, spanning all leading-edge development platforms, for a host of fast-paced and innovative software companies. We offer product-oriented services, across the entire product development lifecycle, to our clients.
We keep excellent company: all our clients are leaders in their spaces. They range from some of the finest startups in Silicon Valley, such as Scalix, KnowNow, Navic, and Sennari, to well-established players including NeuStar, 3M, LiveCapital, and PeopleSoft. Leading blue-chip VCs like Sequoia, Kleiner Perkins, Redrock, and Venrock recommend us as the preferred offshore service provider for companies they are incubating.
Job Description
Experience:- 2-7 years
Location:- Bangalore / Indore / Noida / Gurgaon / Pune / Hyderabad
Immediate Joiners Preferred
Qualifications:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any other degree in a related field
We are looking for candidates with strong software development experience, especially in Big Data technologies including Java/Python and Spark/Hive/Hadoop.
Mandatory Skills: Big Data, Python/Java, Hadoop, Hive, SparkSQL/Spark DataFrame
Must have:-
- 2-6 years of experience
- Hands-on experience on Python/Java
- Expertise in SparkSQL/Spark DataFrame
- Any engineering graduate (BE, B.Tech, M.Tech, MCA, or similar)
Good to have:-
- SQL
- Shell script
- Good knowledge of a workflow engine such as Oozie or Autosys
- Agile Development
- Knowledge of any Cloud
- Passionate about exploring new technologies
- An automation-oriented approach
- Good communication skills
Roles & Responsibilities:- The selected candidate will work on Data Warehouse Modernization projects and will be responsible for the following activities:-
- Develop programs/scripts in Python/Java with SparkSQL/Spark DataFrame, or in Python/Java with cloud-native SQL such as RedshiftSQL/SnowSQL
- Validate scripts
- Tune performance
- Ingest data from source to the target platform
- Orchestrate jobs