What We're Looking For
- Bachelor's Degree or equivalent experience
- 3-5 years of hands-on development experience working with APIs and API Gateway products.
- Experience building API proxies and API policies
- 3-5 years of hands-on MuleSoft experience; MuleSoft certification is a big plus
- At least 7 years of hands-on experience implementing ETL using native database utilities and/or third-party tools (Talend, SSIS, Informatica, ODI, SQL*Loader, custom PL/SQL packages) in an enterprise environment with a relational database backend
- At least 7 years of experience working with RDBMS OLTP or OLAP data stores, such as Oracle, SQL Server, Sybase, or Teradata
- Relational database development experience and proficiency with advanced SQL concepts such as complex joins, triggers, cursors, correlated subqueries, and analytic functions (e.g., Oracle PL/SQL, SQL Server T-SQL).
- Experience extracting data from a variety of data stores, including relational databases, RESTful web services, LDAP directories, and a variety of flat-file structures (e.g., EDI, CCD, HL7).
- Experience working with one or more source control tools (SVN, TFS, Rational ClearCase, etc.)
- Previous technical experience in new development, maintenance, and infrastructure support roles.
- Ability to present accurate development effort estimates and provide timely updates.
- Experience working with end users to gather requirements and build technical solutions from concept to implementation.
- Ability to develop solutions that consider the long and short-term perspectives of business priorities and that are maintainable and extensible.
- Ability to manage and track multiple tasks simultaneously.
- Proven ability to develop detailed solution designs.
- Understanding of architecture constructs (architecture/design patterns, SOAP, REST, etc.).
- Experience working with cross-functional technical teams.
- Strong written and verbal communication skills
- Detail-oriented and well organized
- Understanding of HIPAA and the importance of patient data privacy
- Experience with data modeling including normalization, logical and physical designs.
- Knowledge of software development life cycles and industry best practices
- Ability to work calmly and constructively under pressure and deliver on commitments
What You'll Be Doing
- Provide feedback directly to the product management team to identify product deficiencies and drive product development.
- Hold high-level technical discussions on topics such as OAuth setup, API design, and data transformations.
- Expand and optimize data and data pipeline architecture, as well as data flow and collection for cross-functional teams.
- Work closely with business partners, data scientists, and UI developers to transform data into a format that can be easily analyzed.
- Apply data wrangling/cleansing techniques to conduct statistical analysis and identify key insights for decision-makers.
- Create and maintain optimal data pipeline architecture across various data stores, including SQL Server, Oracle, Teradata, and big data platforms.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Follow standard troubleshooting procedures when diagnosing and resolving issues.
- Assist in creating technical “how-to” documents when necessary
- Learn quickly and take on new challenges.