- Translate the business requirements into technical requirements
- Develop ETL solutions using native and third-party tools and utilities
- Write and optimize complex SQL and shell scripts
- Design and develop code, scripts, and data pipelines that leverage structured and unstructured data
- Build data ingestion pipelines and ETL processes, including low-latency data acquisition and stream processing
- Design and develop processes and procedures for integrating data warehouse solutions into an operational IT environment
- Monitor performance and advise on necessary configuration and infrastructure changes
- Create and maintain the technical documentation required to support solutions
- Coordinate with customers to understand their requirements
- Work with Teradata project managers to scope projects, develop work breakdown structures, and perform risk analysis
- Provide direct support to the solution architect and the solution delivery team.
- Lead a dynamic and collaborative team, demonstrating excellent interpersonal and management skills
- Be ready to travel to customer sites for short-, medium-, or long-term engagements
Skills and Qualifications
- B.S. or M.S. in Computer Science or a related field
- Hands-on experience with Spark using Scala/Python, including DataFrames and the Spark SQL API for faster data processing
- Experience building stream-processing systems using solutions such as Storm, Spark Streaming, Flink, or Kafka
- Hands-on experience with Big Data technologies such as Pig, Oozie, Sqoop, Hive, Impala, NiFi, and Airflow
- Working experience with one or more cloud environments (AWS, Azure, or GCP), including hands-on experience with their services
- Software development experience with a variety of programming languages and frameworks such as Java, AngularJS, C++, Python, Groovy, and Scala
- Experience with a Hadoop distribution such as Cloudera or Hortonworks
- Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
- Good knowledge of one or more ETL tools such as Informatica, DataStage, or Talend
- Strong experience designing and developing ETL architectures
- Strong RDBMS concepts and SQL development skills
- Strong knowledge of data warehousing, data modeling, and mapping techniques
- Experience with data integration from multiple data sources
- Working experience in one or more business areas and industries, such as Telecom, Retail, or Financial Services
- Training/certification in Teradata, a cloud environment (AWS, Azure, GCP), or Cloudera/Hortonworks is a plus
- Excellent communication and presentation skills, both verbal and written
- Proven experience in customer-facing roles for large engagements and in managing solution delivery teams
- Ability to solve problems with a creative and logical mindset
- Demonstrated skills in team leadership, coaching, and competency building
- Must be self-motivated, analytical, detail-oriented, organized, and committed to excellence in all tasks
- Job Type : Full Time
- Industry : IT / Telecom
- Educational Specialization : Information Technology
- Role / Designation : [Teradata] Lead Data Engineer - Big Data
- Last Date to Apply : 17/09/2022
- Salary : PKR. 200,000 - 250,000/Month
- Requirements : MS SQL, Data Analytics, ETL Tools, Amazon Web Services (AWS)