Job Details

  • Title: Big Data Engineer
  • Code: RCI-VTL-20254
  • Location: Irving, TX 75038
  • Posted Date: 05/16/2019
  • Duration: 6 Months
  • Status: Open

  Job Description

  • Position will be part of the Business Analytics Center of Excellence (BA CoE) team responsible for KNIME administration and big-data-related data management.
  • Responsibilities include, but are not limited to: big data management and analysis, KNIME administration and development, architecture, and code and software implementation.

A few of them are listed below:

  • Routine maintenance and support (installation, upgrade, monitoring, and configuration) of the KNIME platform.
  • Maintain the health of the KNIME platform (capacity planning and management).
  • Provide end-user support for escalated production and/or platform issues.
  • Handle change control and promotion, and track support tickets with external vendors and KNIME.
  • Install R and Python packages on the Linux server.
  • Enable data transfer and management on the big data platform using the big data product suite and KNIME Server.
  • Develop Sqoop and Hive scripts for managing and analyzing data on the big data platform.

Required Skills:

  • Bachelor's degree or six or more years of work experience.
  • Six or more years of relevant work experience.
  • Experience working with data, data warehousing, and BI.
  • Minimum of two years as a KNIME administrator.
  • Deep understanding of engineering an entire platform, including Linux, networking, and infrastructure.
  • Experience supporting KNIME deployments on both Linux and Windows environments.
  • 3+ years of hands-on experience working with KNIME.
  • 3+ years of hands-on experience with a proven track record of developing and operationalizing AI/ML models at scale for predictive, prescriptive, and cognitive analytics.
  • 3+ years of experience in data wrangling, advanced analytic modeling, and processing/analyzing structured, semi-structured, and unstructured data.
  • Willingness to work in a fast-paced and demanding environment.
  • Highly analytical, motivated, decisive thought leader with solid critical-thinking ability to quickly connect technical and business ‘dots’.
  • Strong communication and organizational skills; able to deal with ambiguity while juggling multiple priorities and projects at the same time.
  • Experience with Hadoop, Hive, and Pig.
  • Experience with the Hortonworks Hadoop distribution.
  • Experience with big data technology suite products such as Kafka and Flume.
  • Development experience with NoSQL databases such as HBase or Cassandra.
  • Development experience using Python and R.
  • Experience in Linux.
  • Excellent verbal and written communication skills.
  • Master's or Bachelor's degree in Computer Science is desired.