
Turing
About the job
A fast-growing company dedicated to marine sustainability and conservation through the latest advances in artificial intelligence is looking for a Computer Vision Research Engineer. The selected candidate will be responsible for writing technical papers and scientific articles about the methods they develop. The company provides technology tailored to the specific needs of each fishery. This is a great opportunity to contribute to ground-breaking work in preserving some of the world’s most important ecosystems and creating a more sustainable future.
Job Responsibilities:
- Help build novel computer vision algorithms for multiple-object tracking
- Design and carry out experiments
- Propose solutions to shortcomings in the current algorithms
- Aid the team in expanding into a broader range of AI applications and markets
- Write scientific articles and technical reports on the techniques you’ve developed
Job Requirements:
- Master’s degree or Ph.D. in Computer Vision, Engineering, Computer Science (or equivalent experience)
- 4+ years of relevant experience in modern computer vision research and development
- 4+ years of Python experience
- 3+ years of experience with PyTorch and/or TensorFlow
- Familiarity with the state-of-the-art in object detection, multiple-object tracking, and image classification
- Demonstrated understanding of, experience with, and ability to extend algorithms for these tasks
- Publications in relevant venues (CVPR, ECCV, etc.) or demonstrated deployment of impactful real-world computer vision applications are a plus
- Excellent written and verbal English communication skills
- A track record of executing machine learning projects from start to finish and presenting results
- Understanding of production deployment of machine learning algorithms, including MLOps and cloud-based and edge-based solutions, is a plus
- Grant-writing experience is a plus
To apply for this job please visit www.linkedin.com.