Software Engineer
at Alamere Software
Posted: 4-23-2025
Remote
Information Technology and Computer Science
$167,030/year
About this Career
Software Developers
Skills
Workflow Management, Management, Apache Hadoop, Technical Requirements, Microsoft Azure, Shell Script, Kerberos (Protocol), Application Development, Bash (Scripting Language), Sqoop, Apache Kafka, Unix, Data Warehousing, Informatica, Big Data, Scheduling, Python (Programming Language), Apache Spark, Application Programming Interface (API), Apache Oozie, Multi-Tenant Cloud Environments, Ansible, Business Requirements, Java (Programming Language), Data Lakes, Hortonworks Data Platform (HDP), SQL (Programming Language), Containerization, Amazon Web Services, Change Management, Peer Review, Computer Engineering, Software Engineering, Hadoop Distributed File System (HDFS), Apache Hive, CI/CD, Computer Science
Job Description
- Interface with internal business stakeholders to understand business requirements and convert them into technical requirements
- Analyze user needs and software requirements to determine the feasibility of the proposed solution, prepare high-level designs, and provide estimates to the business
- Architect Big Data solutions using Hortonworks Data Platform, Amazon Web Services, and Apache Hadoop
- Design Hadoop pipelines that fetch data from external systems and load it through the data lake into the data warehouse (DWH)
- Develop and demonstrate proofs of concept (POCs) for projects
- Develop and build infrastructure pipelines on cloud and hybrid platforms for application development
- Develop programs for Big Data applications using HDFS, Sqoop, Hive, Oozie, Spark, and Python (a minimal pipeline sketch follows this list)
- Build and manage multi-tenant environments for near-real-time and batch analysis
- Maintain Hadoop clusters, mitigate outages, and upgrade environments as needed
- Develop complex programs and mappings, integrate them into workflows, and schedule them using Informatica BDM (Big Data Management) or the Control-M scheduler
- Automate cluster configuration and maintenance processes using Ansible (see the Ansible sketch after this list)
- Migrate streaming platforms such as Kafka from physical servers to containerized environments (see the Kafka sketch after this list)
- Test new tools in lower environments before deploying them to production
- Collect peer reviews and feedback from the various stakeholders to improve quality, and apply them to future releases
- Develop API programs in Java, Python, and Unix Bash scripting for Big Data applications
- Interact with users and other project team members on a daily basis to perform development and preparatory activities
- Telecommuting from anywhere in the US is an option for this position
- The position requires a Master's degree in Computer Science, Computer Engineering, or a closely related field
- This position also requires education or experience in AWS, Azure, Python, Ansible, SQL, shell scripting, Unix, CI/CD, change management, build and release, containerization, orchestration, the Big Data stack, ELK, and Kerberos
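For illustration, here is a minimal PySpark sketch of the kind of pipeline work described above: it reads an extract that an upstream job (e.g. a Sqoop import) has already landed on HDFS and appends it to a Hive table in the data lake. The path, table, and column names are hypothetical placeholders, not details from this posting.

```python
# Illustrative only: load an extract from HDFS into a Hive table.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("external-to-datalake")   # hypothetical job name
    .enableHiveSupport()               # needed to write managed Hive tables
    .getOrCreate()
)

# Read a CSV extract landed on HDFS by an upstream job; path is a placeholder.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("hdfs:///landing/orders/")
)

# Light cleanup before loading into the warehouse layer.
clean = orders.dropDuplicates(["order_id"]).filter("order_total IS NOT NULL")

# Append into a Hive table in the data lake (hypothetical name).
clean.write.mode("append").format("hive").saveAsTable("datalake.orders")
```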
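Similarly, cluster-configuration automation can be driven from Python via the ansible-runner library; this is a sketch under assumptions, with the project directory and playbook name invented for illustration.

```python
# Illustrative only: run an Ansible playbook from Python with ansible-runner.
import ansible_runner

# ansible-runner expects a "private data dir" containing the playbook and
# inventory; both paths here are hypothetical.
result = ansible_runner.run(
    private_data_dir="/opt/automation/hadoop",  # placeholder project dir
    playbook="configure_cluster.yml",           # placeholder playbook
)

print("status:", result.status)  # e.g. "successful" or "failed"
print("rc:", result.rc)          # shell-style return code
```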
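Finally, a minimal kafka-python smoke test of the sort that might validate a broker after a migration from physical servers to containers; the broker address and topic name are placeholders.

```python
# Illustrative only: produce one message and consume it back to confirm
# end-to-end delivery on a (hypothetically containerized) Kafka broker.
import json

from kafka import KafkaConsumer, KafkaProducer

BOOTSTRAP = "localhost:9092"    # placeholder broker address
TOPIC = "migration-smoke-test"  # placeholder topic

# Produce one JSON message.
producer = KafkaProducer(
    bootstrap_servers=BOOTSTRAP,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"check": "broker reachable after migration"})
producer.flush()

# Consume it back; stop polling after 10 seconds if nothing arrives.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BOOTSTRAP,
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print("received:", message.value)
    break
```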
Other Job Posting Details
Salary
Minimum: $100,510/yr
Maximum: $219,440/yr