
Perception Engineer (The Vision Architect)

Unreal Gigs

Austin, Texas


Job Details

Full-time


Full Job Description

Are you passionate about giving machines the ability to see, interpret, and interact with their environments? Do you excel at developing perception algorithms that enable robots and autonomous systems to understand and navigate complex surroundings? If you’re ready to design the “eyes and brain” of intelligent systems, our client has the perfect role for you. We’re looking for a Perception Engineer (aka The Vision Architect) to develop and integrate perception technologies that enhance the situational awareness and autonomy of advanced robotic systems.

As a Perception Engineer at our client, you’ll work closely with robotics, software, and AI teams to create robust algorithms for object detection, scene understanding, and sensor fusion. Your expertise in computer vision, deep learning, and sensor data processing will be essential in building perception systems that operate accurately and reliably in real-world environments.

Key Responsibilities:

  1. Develop Perception Algorithms for Object Detection and Recognition:
    • Design algorithms that allow robots to detect, recognize, and track objects in real time. You’ll use computer vision, deep learning, and data processing techniques to enhance environmental awareness.
  2. Implement Sensor Fusion for Multi-Modal Perception:
    • Integrate data from various sensors, including cameras, LIDAR, radar, and ultrasonic sensors, to create a comprehensive understanding of surroundings. You’ll work with sensor fusion techniques to improve detection accuracy and depth perception.
  3. Optimize and Calibrate Vision Systems for Real-World Performance:
    • Calibrate sensors and fine-tune perception algorithms to handle variable lighting, weather, and other environmental factors. You’ll ensure reliable perception performance under diverse real-world conditions.
  4. Develop and Test Scene Understanding and Localization Systems:
    • Implement scene segmentation, semantic mapping, and localization algorithms to help robots navigate and interact with complex environments. You’ll enable robots to identify terrain, obstacles, and pathways with precision.
  5. Collaborate on System Integration and Hardware Compatibility:
    • Work closely with hardware and software engineering teams to ensure seamless integration of perception systems with robotic hardware. You’ll support compatibility across platforms and efficient data flow within embedded systems.
  6. Test, Validate, and Optimize Perception Systems in Field Environments:
    • Conduct testing in lab and field settings to validate system performance. You’ll collect data, analyze results, and make iterative improvements to enhance perception accuracy and reliability.
  7. Stay Updated on Advancements in Perception and Computer Vision:
    • Continuously research new methods in computer vision, machine learning, and perception technology. You’ll evaluate emerging tools and techniques to keep systems at the cutting edge of robotics.

Requirements

Required Skills:

  • Computer Vision and Deep Learning Expertise: Strong experience with computer vision techniques, deep learning frameworks such as TensorFlow and PyTorch, and libraries such as OpenCV for real-time perception applications.
  • Proficiency in Sensor Fusion and Data Processing: Knowledge of multi-sensor integration, including fusion algorithms like Kalman filters, and experience working with data from cameras, LIDAR, and radar.
  • Programming Skills in Python, C++, and ROS: Proficiency in Python and C++ for perception algorithm development, with experience in ROS (Robot Operating System) for deployment and testing.
  • Real-Time Optimization and Calibration: Experience in calibrating vision systems and optimizing perception algorithms for real-time performance under variable conditions.
  • Analytical and Problem-Solving Abilities: Strong troubleshooting skills for diagnosing perception issues and refining algorithms to achieve optimal results.

Educational Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Robotics, Electrical Engineering, or a related field. Equivalent experience in perception engineering or computer vision may be considered.
  • Certifications or specialized coursework in computer vision, machine learning, or AI are advantageous.

Experience Requirements:

  • 3+ years of experience in perception engineering or computer vision, with a focus on robotics, autonomous systems, or sensor-based applications.
  • Experience with perception system testing in both lab and field environments is beneficial.
  • Familiarity with embedded vision systems (e.g., NVIDIA Jetson, Intel RealSense) is a plus.

Benefits

  • Health and Wellness: Comprehensive medical, dental, and vision insurance plans with low co-pays and premiums.
  • Paid Time Off: Competitive vacation, sick leave, and 20 paid holidays per year.
  • Work-Life Balance: Flexible work schedules and telecommuting options.
  • Professional Development: Opportunities for training, certification reimbursement, and career advancement programs.
  • Wellness Programs: Access to wellness programs, including gym memberships, health screenings, and mental health resources.
  • Life and Disability Insurance: Life insurance and short-term/long-term disability coverage.
  • Employee Assistance Program (EAP): Confidential counseling and support services for personal and professional challenges.
  • Tuition Reimbursement: Financial assistance for continuing education and professional development.
  • Community Engagement: Opportunities to participate in community service and volunteer activities.
  • Recognition Programs: Employee recognition programs to celebrate achievements and milestones.
