AT&T was the first to deliver advanced telecommunications and technology services to companies headquartered in India. That means big things for your career.
Not only will you join one of the most exciting telecom markets in the world, but you'll also bring your technology expertise to local as well as international customers.
Impact and Influence: This position interacts on a consistent basis with architects, leads, data engineers, and data scientists to support cutting-edge analytics in the development and design of data acquisition, data ingestion, and data products.
Developers will work with Big Data technologies and platforms, both on-premises and in the cloud. They will follow, and contribute to, best practices for software development in the Big Data ecosystem.
Our Big Data platforms are based on the Hortonworks distribution, so candidates should expect to be heavily involved with all components of the HDP and HDF packages.
Roles and Responsibilities: Key Responsibilities:
Develop high-performance, distributed computing tasks using Big Data technologies such as Hadoop, NoSQL, text mining, and other distributed-environment technologies, based on the needs of the Big Data organization.
Use Big Data programming languages and technologies: write code, complete programming and documentation, and perform testing and debugging of various applications.
Analyze, design, program, debug, and modify software enhancements and/or new products used in distributed, large-scale analytics solutions.
Interact with data scientists and industry experts to understand how data needs to be converted, loaded, and presented.
Provide rich insight into consumer behaviors, preferences, and experiences in order to improve the customer experience across a broad range of vertical markets.
Key Competencies and Skills: Technical Skills Required:
In-depth knowledge of and experience with the full Hadoop ecosystem (HDP, HDF, NiFi, MapReduce, Hive, Pig, Spark/Scala, Kafka, HBase; Elasticsearch and Logstash a plus)
Expertise working on Linux/Unix
Cloud-based experience (AWS)
Good understanding of and experience with performance and performance tuning for complex software projects, mainly large-scale and low-latency systems
Experience with data-flow design and architecture
NoSQL experience, e.g. MongoDB or PostgreSQL
Hadoop certifications and/or AWS certifications a plus
Experience understanding and troubleshooting Java programs
Excellent communication skills
Ability to work in a fast-paced, team-oriented environment.
Required/Desired Skills:
Unix/Linux shell scripting (Required, 4 years)
JavaScript (Desired, 2 years)
Spark (Desired, 2 years)
AWS experience (Desired, 1 year)
Education and Qualifications: University degree in Computer Science and/or Analytics
Minimum Experience Required: 3-5 years of experience in Big Data design/development
Additional Information: Afternoon shift