Teradata is hiring a Consultant with expertise in big data and Apache Hadoop to join its dynamic Big Data team.
The ideal candidate is a highly energetic self-starter who can perform complex, hands-on architecture and development work on the Hadoop ecosystem. The consultant will also be tapped to perform proofs of concept (PoCs) for our customers during pre-sales activities.
Successful candidates will:
- Take a role in consulting engagements and managing day-to-day client relationships and results.
- Ensure that engagements adhere to client strategies.
- Ensure engagement quality.
- Ensure engagement adherence to budget objectives and scope.
- Engage with Teradata Account teams and prospective customers to analyse and understand customer requirements.
- Shape and influence customer requirements so that they can be deployed on an optimal Hadoop architecture.
- Qualify requirements and provide guidance within the Big Data CoE to determine whether Hadoop is a good fit for the problem the customer is trying to solve.
- Participate in designing, planning, and executing on-site/off-site customer proofs of concept.
- Configure and use the Hortonworks Hadoop distribution of tools and associated products, typically Hive, Pig, HCatalog, and MapReduce.
- Partner with the Hadoop administrator to secure and configure the Hadoop cluster, optimise performance, and administer the Hadoop environment.
- After PoC execution, document and disseminate the results and lessons learned to all stakeholders.
- Challenge standard approaches.
- Have hands-on experience in the design, development, or support of Hadoop in an implementation environment at a leading technology vendor or end-user computing organization.
- 1+ years' experience implementing ETL/ELT processes with MapReduce, Pig, and Hive.
- Hands-on experience with HDFS and NoSQL databases such as HBase and Cassandra on large data sets.
- Hands-on experience with NoSQL stores (e.g. key-value store, graph DB, document DB).
- 2+ years' experience in performance tuning and in programming languages such as Shell, C, C++, C#, Java, Python, Perl, and R.
- Demonstrate a keen interest in, and solid understanding of, "big data" technology and the business trends driving its adoption.
- Demonstrate analytical and problem-solving skills, particularly those that apply to a Big Data environment.
- Maintain a good level of understanding about the Hadoop technology marketplace.
- Strong understanding of data structures, data modeling, and data warehousing.
- Team-oriented individual with excellent interpersonal, planning, coordination, and problem-solving skills.
- High degree of initiative and the ability to work independently and follow-through on assignments.
- Excellent oral and written communication skills.
- BS or MS degree in Computer Science or a relevant field.
Teradata empowers companies to achieve high-impact business outcomes through analytics. With a powerful combination of Industry expertise and leading hybrid cloud technologies for data warehousing and big data analytics, Teradata unleashes the potential of great companies. Partnering with top companies around the world, Teradata helps improve customer experience, mitigate risk, drive product innovation, achieve operational excellence, transform finance, and optimize assets. Teradata is recognized by media and industry analysts as a future-focused company for its technological excellence, sustainability, ethics, and business value.
The Teradata culture isn’t just about one kind of person. So many individuals make up who we are, making us that much more unique. It’s what sets apart the dynamic, diverse and collaborative environment that is Teradata. But even as individuals, there’s one thing that we all share —our united goal of making Teradata and our people the best we can be.
- Location: United States
- Full Address: San Diego, California