Overall Position Summary and Objectives
We are primarily interested in candidates with expertise in setting up cloud computing environments, running machine learning workloads on the cloud, and deploying machine learning models on the cloud. In particular, we plan to use Amazon Web Services (AWS). Experience with AWS SageMaker and Amazon Mechanical Turk is highly preferred. The incumbent will work closely with a team of computer scientists, engineers, and radiology technologists, and will support post-doctoral and other trainees.
Minimum Education
Bachelor's
Resume Max Pages
15
Certifications & Licenses
- AWS Cloud Practitioner (Preferred)
Skills (Ranked By Priority)
- Cloud Computing
- Machine Learning
- Scientific Data Analysis
Software
- Scikit-learn
- PyTorch
- TensorFlow
Field of Study
- Computer Science
- Computer and Information Systems
- Information Sciences
Deliverables
- Meet with lab members to present updates - Weekly
- Work products and documents related to setting up and maintaining cloud computing environments - Bi-Weekly
- Work products and documents related to performing model deployment on the cloud - Bi-Weekly
- Work products and documents related to troubleshooting problems with the cloud computing environment - Bi-Weekly
- Work products and documents related to setting up and maintaining crowd-sourced labeling tasks on Amazon Mechanical Turk - Bi-Weekly
- Work products and documents related to preparing research data for presentations, manuscripts, and IRB submissions - Ad-Hoc
Statement of Work Details
Provides the technical expertise needed to perform analyses, processing, and user support of various computer systems using standard statistical procedures and techniques.
- Provide integrated domain expertise across complex computational environments.
- Provide software recommendations and solutions across diverse scientific applications, including freeware relevant to analyses of scientific discovery.
- Provide support and consultation in the areas of scientific computation, application-development and computer modeling.
- Support and maintain scientific computer systems including hardware, operating systems, and associated services in a high-performance computing environment.
- Provide software solutions across diverse scientific applications including large and complex multi-component resources.
- Work with staff on the evolving infrastructure, data engineering pipeline, and data science stacks.
- Set up and maintain a cloud computing environment on AWS for machine learning and for deploying models to the cloud (an illustrative sketch follows this list).
- Perform algorithm development and implementation.
- Manage data formatting for input and output.
- Drive collection of new data and the refinement of existing data.
- Prepare data and analysis for presentations and publication.
- Prepare reports and offer solutions supporting ongoing needs assessment and strategic planning related to computer systems management and engineering.
- Provide documentation as required and participate in code reviews, planning sessions, and routine status meetings.
- Educate users in the use of the tools and reports.
- Plan, organize, and coordinate formal and informal IT-related training and orientation to IT initiatives.
- Support and train users in applying data analysis pipelines and methods for research studies.
- Conduct educational courses on basic scientific computing, data analysis scripts and pipelines.
- Consult and collaborate with users to explain new tools and enable them to be adapted to meet specific research needs.
- Maintain computational infrastructure, including upgrading and configuration of hardware and software systems.
- Mitigate and remediate security vulnerabilities; ensure that all operating systems are hardened and up to date; administer firewalls, web servers, file servers, and database servers.
- Independently monitor internal network systems; address system performance issues.
- Perform routine audits of the tracking database and packages.
- Collaborate with staff to increase the productivity/efficiency of data analysis using high-performance computing.
- Create system documentation, electronic templates and examples, training materials and presentations.
- Develop dedicated and efficient computer software using state-of-the-art computer science approaches, including modern programming languages and code libraries, machine learning, cross-platform programming, and high-performance computing.
- Develop scripts to automate data processing pipelines.
- Manage data storage and backups for scientific computers using available resources.
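For illustration only, a minimal sketch of the model-deployment duty described above, using the SageMaker Python SDK to host a trained PyTorch model as a real-time endpoint. The S3 artifact path, entry-point script, instance type, and framework/Python versions are placeholders rather than project specifics.

    import sagemaker
    from sagemaker.pytorch import PyTorchModel

    # Placeholder values throughout; substitute the project's actual artifacts and settings.
    session = sagemaker.Session()
    role = sagemaker.get_execution_role()  # assumes code runs under a SageMaker execution role

    model = PyTorchModel(
        model_data="s3://example-bucket/models/model.tar.gz",  # hypothetical S3 model artifact
        role=role,
        entry_point="inference.py",   # hypothetical inference handler script
        framework_version="2.0",      # confirm versions available in your account/region
        py_version="py310",
        sagemaker_session=session,
    )

    # Deploy to a real-time endpoint; the instance type is a placeholder.
    predictor = model.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.large",
    )

    # Example request against the endpoint (payload format depends on the model).
    # result = predictor.predict(payload)

    # Delete the endpoint when finished to avoid idle charges.
    # predictor.delete_endpoint()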