· Deliver end-to-end data and analytics platforms and capabilities, including requirements assessment, prototyping, comparative analysis, design, and solution implementation
· Responsible for the overall quality of project deliverables and the successful implementation of the defined solution for the customer
· Analyze the latest big data analytics technologies and their innovative applications in both business intelligence and new service offerings, and bring these insights and best practices to the architecture and implementation of complex big data solutions
· Provide expertise across the big data technology space, including cloud architecture, security, data storage, data ingestion, data processing, data science, data visualization, etc.
· Support project engagements, work collaboratively with other team members, communicate effectively, and foster team success
· Clearly document findings and recommendations that can be shared both internally and externally
· Build, maintain, and scale infrastructure for Production, QA, and Dev environments
· Comfortable briefing internal and external stakeholders on findings and solutions
· Analyze client data and existing systems architecture to determine whether requirements can be met
· Be financially mindful and consider the customer's best interests when selecting the tools and technologies for the project
Expert experience in the following:
· Hadoop administration, configuration, and performance tuning (EMR strongly preferred; Hortonworks and Cloudera experience will also be considered)
· Linux administration and configuration
· Linux security and Active Directory integrations
· AWS resources such as CloudFormation, EC2, S3, RDS, IAM, Lambda, infrastructure scripting, deployment of VPCs, subnets, NACLs, Route 53, Kinesis, serverless technologies, etc.
· Owning, provisioning, and managing EC2 instances and CloudFormation stacks
· Bash shell scripting in the AWS cloud environment (critical)
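As a sense of the day-to-day Bash scripting this role involves, here is a minimal sketch of a housekeeping script in an AWS environment. The bucket name, prefix, and the naming-validation helper are hypothetical illustrations, not part of this posting; the script assumes the AWS CLI may or may not be installed and degrades gracefully.

```shell
#!/usr/bin/env bash
# Sketch: validate an S3 bucket name locally, then (only if the AWS CLI
# is available) list objects under a prefix. Names below are hypothetical.
set -euo pipefail

# Returns 0 if the argument looks like a valid S3 bucket name:
# 3-63 characters; lowercase letters, digits, dots, hyphens;
# must start and end with a letter or digit.
valid_bucket_name() {
  local name=$1
  [[ ${#name} -ge 3 && ${#name} -le 63 ]] || return 1
  [[ $name =~ ^[a-z0-9][a-z0-9.-]*[a-z0-9]$ ]] || return 1
  return 0
}

bucket="example-analytics-landing"   # hypothetical bucket
prefix="raw/2024/"                   # hypothetical prefix

if valid_bucket_name "$bucket"; then
  echo "bucket name OK: $bucket"
else
  echo "invalid bucket name: $bucket" >&2
  exit 1
fi

# Attempt the API call only when the CLI is installed; tolerate
# missing credentials rather than aborting the whole script.
if command -v aws >/dev/null 2>&1; then
  aws s3 ls "s3://${bucket}/${prefix}" || true
fi
```

The `command -v` guard and `|| true` fallback are the kind of defensive patterns that keep cron-driven scripts from failing noisily on partially configured hosts.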
Strong experience in the following:
· Experience working with the Serverless Framework
· Experience with automation software (e.g., Ansible, Puppet, CFEngine, Chef)
· Experience in encryption and key management technologies
· Experience with monitoring systems
· Experience with virtualization (e.g., VMware, VirtualBox) and containerization
· Experience with DevOps concepts such as Infrastructure as Code
· Experience with CI/CD tools and processes
· Experience with agile delivery methodologies using Jira or similar tools
· Experience working with remote teams
· Education: Bachelor's Degree in Computer Science or a relevant field; Master's Degree is a plus
· 2 to 4 years of relevant experience, or an equivalent combination of experience and education
· Solid scripting skills (e.g., shell scripts, Perl, Ruby, Python)
· Solid networking knowledge (OSI network layers, TCP/IP)
· AWS Solutions Architect certification; Professional-level certification is a plus
· Must be able to work in the U.S.