Job Description

What makes Cognizant a unique place to work? The combination of rapid growth and an international, innovative environment! This creates many opportunities for people like YOU - people with an entrepreneurial spirit who want to make a difference in this world. At Cognizant, together with colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative, and more successful. Moreover, this is your chance to be part of the success story.

Position Summary:

We are looking for a Big Data Developer who will collect, load, process, and analyze huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

• Exposure to, or an understanding of, a broad cross-section of on-prem, hybrid, and cloud data platforms, including Azure and AWS.
• Design and engineer innovative cloud-based solutions in multi-cloud scenarios across Microsoft Azure and AWS.
• 7+ years of demonstrated coding experience in batch frameworks and Hive/Impala (languages such as Scala, Java, Python, PL/SQL, and Unix shell).
• Experience designing, building, and implementing the Lambda architecture on native cloud.
• Expertise in creating Data Factory pipelines to ingest data from flat files of different formats.
• Demonstrated experience analyzing data stored in a data warehouse using SQL.
• Expertise in writing stored procedures and functions to process or transform large datasets in a SQL database.
• In-house proprietary tools: 3+ years of working experience with JOF Job, Merlin, and the JOF Ingestion and Replication framework to develop data ingestion pipelines and deploy code artifacts.
• Hands-on implementation of end-to-end data pipelines using Azure Data Lake, Azure Data Factory, and Azure Databricks.
• Expertise with database and data lake services such as Azure Synapse, Azure SQL, Azure Blob Storage, and Azure Data Lake Storage Gen2.
• Solid experience setting up advanced deployment techniques such as CI/CD using Azure DevOps, GitHub integration, Jenkins, Bamboo, etc.
• Should be involved in end-to-end project execution: requirement gathering, transforming legacy designs to the Big Data ecosystem, development, testing, UAT support, and go-live support.
• The ability to understand and create Tableau/Power BI reports would be an additional advantage.
• Should have knowledge of Control-M to create, monitor, and schedule jobs.

Mandatory Skills:

• Big Data PaaS services ecosystem: Hadoop, Flume, Kafka, Spark Streaming, Spark SQL, Impala, SQL DW, Kudu tables.
• Languages: SQL, PL/SQL, Java, UNIX, Scala, Python, and Ruby.
• Azure stack: Azure Functions, Azure Event Hubs, Azure HDInsight, Azure Databricks, Azure VM, Azure Data Factory, Azure Storage, Azure Data Lake, Azure Cosmos DB, Azure Key Vault, Azure SQL DW (Synapse).
• AWS stack: Kinesis, Lambda, S3, EC2, DynamoDB, API Gateway, KMS, VPC, CloudWatch, IAM roles and policies.
• DevOps tools: Azure DevOps, JIRA, Git, Bitbucket, Bamboo, Jenkins, Confluence.
• Client-specific: JOF Job, Merlin, JOF Ingestion framework, Control-M.

Duties and Responsibilities:

• The associate will design, code, test, document, and maintain high-quality, scalable Big Data solutions on-prem or in the cloud.
• Research, evaluate, and deploy new tools, frameworks, and design patterns to build a sustainable Big Data platform.
• Proactively communicate and collaborate with stakeholders, product specialists, and data architects to capture functional and non-functional requirements.
• Make accurate development effort estimates to assist management in project planning.
• Migrate on-premises workloads to the cloud and debug cloud stacks.
• Design and build CI/CD and/or DevOps pipelines, and improve existing workflows through deployment automation.
• Respond to technical issues in a professional and timely manner.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Salary Range: >$100,000

Next Steps:

If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us. For a complete list of open opportunities with Cognizant, visit http://www.cognizant.com/careers. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.

Employee Status: Full Time Employee
Shift: Day Job
Travel: No
Job Posting: Jan 30, 2024

About Cognizant

Cognizant (Nasdaq-100: CTSH) is one of the world's leading professional services companies, transforming clients' business, operating, and technology models for the digital era. Our unique industry-based, consultative approach helps clients envision, build, and run more innovative and efficient businesses. Headquartered in the U.S., Cognizant is ranked 185 on the Fortune 500 and is consistently listed among the most admired companies in the world. Learn how Cognizant helps clients lead with digital at www.cognizant.com or follow us @Cognizant.