Dr. Amir is a research staff member in the Cognitive Computing group at the IBM Almaden Research Center, where he works on the SyNAPSE project. As a member of the software team, he develops the Corelet Programming Language and also uses it to create some cool corelets - generating efficient networks for the TrueNorth neurosynaptic architecture. Since joining IBM Almaden in 1997, he has worked on a number of projects in computer vision, speech and video indexing and retrieval, eye gaze tracking, and human-computer interaction.
In 2008 Dr. Amir started and led the DuraBytes project, where he was the visionary and co-inventor of the IBM Linear Tape File System (LTFS). With the goal of making data tapes more useful for the Media and Entertainment (M&E) industry, Arnon was very intrigued by The Digital Dilemma - a 21st-century challenge to M&E and many other industries. By making data tapes self-describing, self-contained, and easy to use, LTFS gives the formerly so-called "backup tape" many of the properties and the ease of use of a portable hard drive, combined with tape's native high density, long life, and low power consumption. LTFS makes data tapes well suited for Digital Media Workflows, ironically also called Tapeless Workflows because they are file-based, thus replacing the video cassette ☺. The IBM LTFS product family is now available as part of IBM Tape Storage Products. LTFS has been adopted by many companies into their own products and has received broad recognition and several prestigious awards, including an Emmy and the Hollywood Post Alliance Award. (For more about LTFS, see LTFS on Wikipedia.)
Prior to DuraBytes, Arnon was a member of the AALIM project for multimodal medical decision support. The AALIM system analyzes ECGs, echocardiograms, and other medical records of cardiac patients, finds and groups similar cases, and generates longitudinal personal views along with case-related, evidence-based medical information, aggregated data, charts, and reports.
Arnon's research topics include image and video segmentation, shot boundary detection, speech indexing (both phonetic and speech-recognition based), multimodal video retrieval, and efficient video browsing, visualization, and summarization. This work was done while he was a member of the CueVideo project, the Multimedia Mining Adventurous Research project, and the Multimedia Understanding and Semantic Extraction (MUSE) project. He was a core member of the IBM team for the NIST TRECVID video retrieval benchmark for six years, starting with its inception in 2001.
Dr. Amir has a special interest in eye gaze tracking systems and their use in human-computer interaction. As part of the BlueEyes project he developed methods for calibration-free eye gaze tracking with free head motion and for eye-contact detection, as well as an embedded system for eye detection.
In his leisure time, Arnon is a marathon runner and a coach at San Jose Fit. Running has changed his life, and through coaching he helps others improve their lives, too.