Job Duties:
Work with other members of the technical team and business team to plan and document the redesign and development processes for the digital banking interface and big data analytical system and application Project;
Work with the Project Lead to collect and translate business/user requirements into technical and functional requirements and specifications for the digital banking interface and big data analytical system and application Project;
Work with the Project Lead to perform technical and feasibility studies on the development of the new/enhanced system through redesign of the existing system architecture, and prepare documentation on the proposed changes to be made to the digital banking interface and big data analytical system and application;
Work with other team members to perform gap analysis and create "As-Is" and "To-Be" business process workflows and documentation covering the functional and technical workflows for the existing and/or redesigned Hadoop (Big Data) system/application;
Participate in daily and/or weekly system/application architectural design reviews to assess proposed changes, identify errors, and determine remedies to be applied, ensuring that the new architecture conforms to the specified technical and functional requirements;
Participate in weekly digital banking interface and big data analytical system and application Project status meetings with other members of the Technical Team and Business Team and other stakeholders;
Design, code, program, develop, and implement web-based database models, digital user interfaces, flow charts, and workflows using Visual Studio, C#, ASP, ASP.NET, HTML, XML, SQL scripts, and SQL Server;
Design, program, and implement interfaces, database software code, and software scripts for data conversion and migration;
Perform big data analytics, data validation, and stored-procedure development using SQL queries, jQuery, and SQL Server;
Install and configure multiple Hadoop clusters on the platform as per the requirements;
Configure Hadoop Big Data components on the database clusters, such as MapReduce, YARN, ZooKeeper, HDFS, Hive, HBase, Spark, Sqoop, and Oozie, and ensure that they function as expected;
Develop and automate ETL workflows to import and analyze required data from several RDBMS sources, such as SQL Server, Oracle, and DB2, into Hadoop (Big Data) using components such as Oozie, Sqoop, Hive, and UNIX shell scripts, and schedule them to run daily;
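A typical daily import in such an ETL workflow is a `sqoop import` invocation per source table. The sketch below assembles one such command in Python; the JDBC URL, credentials path, table, and HDFS directory are all hypothetical placeholders, not details from this posting.

```python
# Sketch: build the argv list for a daily Sqoop import of one RDBMS table
# into Hive. All connection details below are hypothetical placeholders.

def build_sqoop_import(jdbc_url, username, table, target_dir, num_mappers=4):
    """Return the argv list for a `sqoop import` of one table into HDFS/Hive."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--password-file", "/user/etl/.password",  # keeps credentials off the command line
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
        "--hive-import",     # load the data straight into a Hive table
        "--hive-overwrite",  # replace yesterday's load on each daily run
    ]

cmd = build_sqoop_import(
    "jdbc:sqlserver://dbhost:1433;databaseName=banking",
    "etl_user", "transactions", "/data/raw/transactions")
print(" ".join(cmd))
```

In practice a scheduler such as Oozie (or cron driving a shell wrapper) would run one of these per source table on the daily schedule the duty describes.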
Install and configure data analytics tools such as R and SAS on the Hadoop clusters, and provide the necessary technical support to data scientists to ensure that the tools work properly;
Monitor multiple clusters on the Hadoop platform and resolve any issues that occur, ensuring that the platform is always available for data scientists to run their jobs;
Develop and implement, using Python, an automated monitoring application that transmits high-priority alerts if any component of the Hadoop platform is down;
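The core of such a monitor is simple: probe each component, collect the failures, and raise an alert when any are down. This is a minimal sketch of that logic, assuming some external health probe has already produced a status per component; the component names and email addresses are hypothetical.

```python
# Minimal alerting sketch: given per-component health statuses, build a
# high-priority alert for any components that are down. Names and addresses
# below are hypothetical placeholders.
import smtplib
from email.message import EmailMessage

def find_down_components(statuses):
    """statuses: dict mapping component name -> bool (True = healthy)."""
    return [name for name, healthy in statuses.items() if not healthy]

def build_alert(down):
    """Build the alert email listing every failed component."""
    msg = EmailMessage()
    msg["Subject"] = "HIGH ALERT: Hadoop components down: " + ", ".join(down)
    msg["From"] = "hadoop-monitor@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content("The following components failed their health check:\n"
                    + "\n".join(down))
    return msg

# Example run with statuses a health probe might have returned:
statuses = {"HDFS NameNode": True, "YARN ResourceManager": False,
            "HBase Master": True, "Hive Metastore": False}
down = find_down_components(statuses)
if down:
    alert = build_alert(down)
    # smtplib.SMTP("mailhost").send_message(alert)  # real delivery elided
    print(alert["Subject"])
```

A production version would loop on a timer, probe each service endpoint directly, and deliver via the team's actual paging channel rather than plain email.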
Design, develop, and implement HiveQL queries that materialize existing Hive tables into data that can be used directly for data analytics;
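Materializing a table for analysts typically uses the `CREATE TABLE ... AS SELECT` (CTAS) pattern, which HiveQL supports. The sketch below demonstrates that pattern with Python's built-in sqlite3 standing in for Hive so the example runs anywhere; the table and column names are hypothetical.

```python
# CTAS materialization pattern, as used in HiveQL; sqlite3 stands in for the
# Hive engine here so the sketch is runnable. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (account_id TEXT, amount REAL, txn_date TEXT);
    INSERT INTO transactions VALUES
        ('A1', 120.0, '2024-01-01'),
        ('A1',  30.0, '2024-01-02'),
        ('B2',  75.0, '2024-01-01');

    -- Materialize a per-account daily summary analysts can query directly,
    -- instead of re-aggregating the raw table on every query.
    CREATE TABLE daily_account_totals AS
        SELECT account_id, txn_date, SUM(amount) AS total_amount
        FROM transactions
        GROUP BY account_id, txn_date;
""")
rows = conn.execute(
    "SELECT account_id, total_amount FROM daily_account_totals"
    " ORDER BY account_id, total_amount").fetchall()
print(rows)
```

On Hive the same CTAS would typically be scheduled (e.g. via Oozie) so the materialized table is refreshed as new data lands.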
Review, update, and monitor big data analytics and digital system interface performance;
Configure and administer Spark, and provide technical training and support to the data science and analytics team on the use of Spark for their use cases;
Perform upgrades on the cluster whenever new versions of big data analytics components are released, and resolve any issues caused by the upgrade by working with users and the product vendor;
Perform big data and database administration, monitoring, upgrade, disaster recovery, and technical support to end-users;
Design, code, program, and install redesigned database models and architecture as well as third-party software and hardware;
Perform troubleshooting and system maintenance, backup and disaster recovery;
Establish and monitor user environments, directories, and security profiles, and ensure/verify proper functioning of all aspects of the Hadoop platform and database;
Provide 24×7 production, technical support, and maintenance.
Minimum Requirements: Bachelor’s degree or higher in Computer Science or related fields of study and 3 years of experience in software or systems and applications design and development.
Location: Herndon, VA; Washington, DC; and Charlotte, NC