Careers

Job Openings
  • Microsoft Dynamics CRM Developer

    Title: Microsoft Dynamics CRM Developer

    Location: 715 Reigate Road, Charlotte, North Carolina 28262

    Job Duties:

    • Design and develop the CRM application UI using JavaScript, HTML, and CSS, along with custom entity forms and out-of-the-box business rules and workflows.
    • Develop server-side code to automate CRM processes using custom C# plugins, registering them in each environment.
    • Prepare technical design documents for the project workflow and participate in functional and technical requirements gathering.
    • Include individual components in partial solutions so that changes to specific entities can be deployed to other environments.
    • Work on application development and verify that completed tasks meet coding standards.
    • Migrate legacy C#, JavaScript, and Web API code to newer versions to run more efficiently in the CRM environment. Participate in developing Canvas Applications to meet organizational requirements.
    • Develop SOAP and REST services in C#-based web/Windows applications to communicate with the Dynamics 365 application.
    • Develop custom workflows and register them with the Plugin Registration Tool. Develop Model-Driven Applications for a better user experience.
    • Create custom dashboards and generate automated monthly reports based on the specific views and fields needed in each report.
    • Create custom ribbon buttons on entity forms and automate processes on each custom entity form to manipulate CRM data.
    • Implement SQL queries and scripts against the database to meet requirements and manipulate CRM data.
    • Write test cases to verify changes in the QA environment, validate the code, and test the application's full functionality before deployment.

    This position requires, at a minimum, a bachelor’s degree in computer science, computer information systems, computer technology, or a combination of education and experience equating to the U.S. equivalent of a bachelor’s degree in one of the aforementioned subjects.

  • Senior Cloud Developer

    Title: Senior Cloud Developer

    Location: 1845 Bonham Lane, Round Rock, TX 78664

    Job Duties:

    • Participate in the analysis, specification, design, implementation, and testing phases of the SDLC, and use Agile methodology for application development.
    • Gather and understand requirements from business users; break down and complete tasks based on priority.
    • Build APIs using the Python Django framework and use them to access the database.
    • Develop major functionality and enhancements for Python RESTful API services that send and receive data between multiple systems.
    • Use Git for version control and deploy projects using Jenkins.
    • Review developed code, write test cases using pytest and unittest, and prepare for code releases.
    • Perform all aspects of application programming and development, including file design, update, storage, and retrieval.
    • Write Python scripts to parse JSON files and load data into the database.
    • Write test cases to achieve code coverage.
    • Migrate code across multiple environments.
    • Work with Azure Cloud to deploy web applications and create Azure pipelines to move data from different sources.

    This position requires, at a minimum, a Bachelor’s degree in computer science, computer information systems, information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.

  • Hadoop Developer

    Title: Hadoop Developer

    Location: 4800 Deerwood Campus Parkway, Jacksonville, Florida 32246

    Job Duties:

    • Select and integrate Big Data tools and frameworks required to provide requested capabilities.
    • Understand input-to-output transformation requirements and apply them per business requirements.
    • Integrate data flows from IBM DB2 and process them using the ingestion framework.
    • Implement ETL processes using Apache Spark, Scala, Sqoop, SQL, Kafka, and IBM DB2.
    • Manage the Hadoop cluster with all included services, such as Hive, HBase, MapReduce, Spark, Scala, and Sqoop.
    • Clean data per business requirements using user-defined functions (UDFs) in Spark.
    • Build distributed, reliable, and scalable data pipelines to ingest and process data in real time using Spark Streaming and Kafka.
    • Apply different HDFS formats and structures, such as Parquet and Avro, to speed up analytics.
    • Assess the quality of datasets for a Hadoop data lake.
    • Create high-performing shell scripts to automate jobs and deploy them to the Control-M scheduler.
    • Troubleshoot and debug runtime issues across the Hadoop ecosystem.
    • Push code to GitHub, deploy the application using Jenkins, and provide post-deployment support in various environments.
    • Support the migration of applications from the current Hadoop environment (HDP) to Cloudera (CDP).
    • Modify existing code to support the Cloudera platform and perform the required unit testing to meet business requirements.
    • Fine-tune applications for higher performance by running them in cluster mode and upgrading dependency JARs to newer versions.

    This position requires, at a minimum, a bachelor’s degree in computer science, computer information systems, Technology management, or a combination of education and experience equating to the U.S. equivalent of a bachelor’s degree in one of the aforementioned subjects.

  • Senior Java Developer

    Senior Java Developer with a Master’s degree in Computer Science/Applications, Engineering (any), Technology, or a related field, and 2 years of experience in developing, creating, and modifying general computer applications, software, or specialized utility programs. Analyze user requirements and develop software solutions. Develop business web applications with technologies such as Java, Spring Boot, Hibernate, React.js, CSS, and HTML; design and develop the presentation layer using JSP/Servlets and the Spring MVC Framework; and develop and deploy scalable enterprise-wide operations on OpenShift cloud platforms and Amazon Web Services (AWS).

    Work location is West Chester, PA with required travel to client locations throughout the USA. Please mail resumes to 1595 Paoli Pike, Suite 203, West Chester, PA 19380 (OR) e-mail: Jobs@stiersol.com

  • Application Developer

    Application Developer with a Master’s degree in Computer Science, Engineering (any), Technology, or a related field, and 1 year of experience, to perform code reviews based on defined standards. Coordinate and communicate between the business, onsite team, and offshore team. Design reports and dashboard pages with dynamic graphs based on captured details using HTML, CSS, AJAX, and AngularJS. Develop REST API services using a customized MVC framework, as well as a customized API framework with JSON and SOAP. Perform load tests; troubleshoot and fix issues; and monitor and performance-tune web applications. Apply knowledge of unit testing and test automation practices. Review code and software module designs and components to improve quality and reuse. Enhance existing applications utilizing AngularJS and an Angular-created navigation menu. Create prototypes in HTML5, JavaScript, and CSS3 for different UI pages.

    Work location is West Chester, PA with required travel to client locations throughout the USA. Please mail resumes to 1595 Paoli Pike, Suite 203, West Chester, PA 19380 (OR) e-mail: Jobs@stiersol.com

  • Hadoop Developer

    Title: Hadoop Developer

    Location: 13440 N 44th St, Apt #2063, Phoenix, AZ 85032

    Job Duties:

    • Analyze Business Requirement Documents and Implement Technical Solutions for privacy related applications.
    • Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability and data optimization.
    • Develop ETL processes supporting data extraction, transformation, and loading using Sqoop, Magellan, Lucy, Hive, SQL, and Spark.
    • Develop UNIX scripts to load data from the UNIX server to HDFS and validate files between different servers.
    • Develop new processes to implement state-level privacy regulations, based on each state’s law, on the Big Data platform.
    • Create Hive managed/external tables, load them with data, and write HQL and Spark SQL queries.
    • Optimize and tune ETL objects, indexes, and partitions for better performance and efficiency.
    • Validate performance metrics and work on performance tuning for SQL, HQL, and Spark SQL queries.
    • Perform testing and provide support for various testing phases, such as unit, user acceptance, regression, parallel, and system testing.
    • Participate in all release management/deployment activities, providing project details, back-out plans, and implementation of projects into the live environment.
    • Develop shell scripts to invoke Hive, Spark, and SQL scripts, and to unzip files and format data before loading into tables.
    • Promote components to the production environment through the CI/CD process using Jenkins, GitHub, and XLR.
    • Schedule jobs by setting up dependencies in the Event Engine Scheduler.
    • Monitor existing production applications and perform root cause analysis on production issues.

    This position requires, at a minimum, a Bachelor’s degree in computer science, computer information systems, technology management, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.

  • Application Developer

    Job Title: Application Developer

    Job Location: 2222 Addison Ln, Johns Creek, Georgia 30005

    Job Duties:

    • Develop and write code for mobile accessibility components to improve application usability and performance using SwiftUI and Combine as part of the Sysco brand conversion effort.
    • Implement reactive programming using RxSwift and Swift 5 with the MVVM pattern and protocol-oriented programming, and interact with third-party libraries through Swift Package Manager.
    • Design and develop applications using Swift and SwiftUI, and work with SDKs (Cocoa Touch, Core Services) and the third-party frameworks RxSwift and RxGRDB.
    • Write code and follow the standards of the Bitrise CI/CD tool to continuously deliver a stable, high-performing application without issues for users.
    • Develop test cases using the automated testing frameworks XCTest and XCUITest, plus snapshot testing, to ensure application reliability across all Apple devices.
    • Deploy and maintain applications in the App Store, with periodic updates for bug fixes and new feature releases.
    • Understand and adhere to the Sysco Confluence page for merging code into GitLab; collaborate with developers/architects to ensure merge requests are reviewed, approved, and merged.
    • Refactor localized files to support multiple languages, and standardize strings to support both the iOS and Android applications.
    • Develop code using RESTful web services with a JSON structure via GraphQL, and integrate the GRDB.swift wrapper to perform DB CRUD operations.
    • Implement code reusability by instantiating reusable views for UIViewControllers, UICollectionViews, UITableViewCells, and Storyboards.
    • Participate in Agile Scrum calls, requirements gathering, code review sessions, and retrospective meetings to deliver assigned user stories per the acceptance criteria.
    • Integrate the code with Firebase to track application analytics and crashes via Crashlytics, and with Tealium to monitor and capture application performance during each trackState and trackAction event call.
    • Migrate existing legacy code to the latest Swift and SwiftUI versions to adhere to industry-standard practices.
    • Work closely with Product Owners, Scrum Masters, Senior Business Analysts, UX teams, and other client stakeholders.

    This position requires, at a minimum, a Bachelor’s degree in computer science, computer information systems, information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.

Apply Now
We'll get back to you within 48 hours.

First Name:

Last Name:

Email:

Phone Number:

Position:

Profile: