Jans Aasman is a Ph.D. psychologist and expert in Cognitive Science, as well as CEO of Franz.com, an early innovator in Artificial Intelligence and provider of Semantic Graph Databases and Analytics. As both a scientist and a CEO, Dr. Aasman continues to break ground in the areas of Artificial Intelligence and Semantic Databases as he works hand-in-hand with organizations such as Montefiore Medical Center, Blue Cross/Blue Shield, Siemens, Merck, Pfizer, Wells Fargo, and BAE Systems, as well as U.S. and foreign governments.
Dr. Aasman is a frequent speaker in the Semantic technology industry and has authored multiple research papers and bylines; he is also one of 15 CEOs interviewed in the book "Startup Best Practices."
Dr. Aasman spent a large part of his professional life in telecommunications research, specializing in applied Artificial Intelligence projects and intelligent user interfaces. He gathered patents in the areas of speech technology, multimodal user interaction, and recommendation engines while developing precursor technology for the iPad and Siri from 1995 to 2004. He was also a part-time professor in the Industrial Design department of the Technical University of Delft.
Before joining Franz Inc. in 2004, Dr. Aasman's experience included the following:
KPN Research, the research lab of the major Dutch telecommunication company.
Tenured Professor in Industrial Design at the Technical University of Delft. Title of the chair: Informational Ergonomics of Telematics and Intelligent Products.
Carnegie Mellon University. Visiting Scientist at the Computer Science Department of Prof. Dr. Allen Newell.
Researcher at the Traffic Research Center of the University of Groningen (The Netherlands).
Studied experimental and cognitive psychology at the University of Groningen, specializing in psychophysiology and cognitive psychology.
2020 Talk: The Knowledge Graph that Listens
Enterprises that are building Knowledge Graphs are rapidly getting a grip on unstructured data, thanks to current advances in Natural Language Processing (NLP) techniques. But one large mass of unstructured data remains untapped: spoken conversations with customers. Speech-to-text systems for general-purpose conversations (e.g., Google, Alexa, Siri) have proven themselves in the market to be highly accurate. However, in domain-specific industries with many product names, industry lingo, and acronyms, speech recognition technology often struggles with the accuracy and usefulness of the captured content. In this presentation, we will demonstrate how taxonomy-driven speech recognition solves these industry-specific terminology challenges for real-time voice capture, and how this process augments an Enterprise Knowledge Graph with customer insights, enabling real-time decision support.
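The pipeline described above can be sketched roughly as follows: a domain taxonomy supplies canonical terms along with phrases a general-purpose recognizer commonly mis-hears them as; the raw transcript is corrected against the taxonomy, and the recognized domain terms are then asserted into the knowledge graph as triples. This is an illustrative sketch only; the taxonomy entries, predicate name, and triple representation are assumptions for the example, not the actual implementation.

```python
# Illustrative sketch of taxonomy-driven transcript correction feeding a
# knowledge graph. Taxonomy entries and predicate names are hypothetical.

# Domain taxonomy: canonical term -> phrases a general-purpose speech
# recognizer commonly mis-hears them as.
TAXONOMY = {
    "metoprolol": ["met a pro lol", "metro pole all"],
    "EHR": ["e h r"],
}

def correct_transcript(text, taxonomy):
    """Replace known misrecognitions with their canonical domain terms."""
    for canonical, variants in taxonomy.items():
        for variant in variants:
            text = text.replace(variant, canonical)
    return text

def extract_triples(call_id, text, taxonomy):
    """Assert each recognized domain term into the graph as a triple."""
    return [(call_id, "mentions", term) for term in taxonomy if term in text]

raw = "patient asked about met a pro lol refills in the e h r"
corrected = correct_transcript(raw, TAXONOMY)
triples = extract_triples("call:1042", corrected, TAXONOMY)
print(corrected)   # "patient asked about metoprolol refills in the EHR"
print(triples)
```

In a production setting the taxonomy would also bias the recognizer's language model before transcription, rather than only correcting its output after the fact.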
2019 Talk: The Chasm of a Million Analytics, and How to Bridge it?
While our understanding of healthcare and wellness is defined within a continuum of interrelated clinical, behavioral (socio-economic and cultural), environmental, and genetic factors, the healthcare system is extremely compartmentalized, broken into many silos of narrowly focused interventions (cardiology, radiology, infectious diseases, surgery, critical care, and so on). Analytic solution providers exploit this situation by offering point solutions and disconnected products that are unaware of each other, compounding the fragmentation of the healthcare data and analytics ecosystem and unleashing an army of blind men who individually try to find and tame the elephant.
From an organizational perspective, this may be an insurmountable barrier to industry-wide, large-scale adoption of advanced analytics and data-driven practice, since there is a hard limit on how many disconnected point solutions and operational silos a healthcare system can afford to build, buy, or subscribe to. One has to ask and answer the following questions: How many analytic models (AI-driven or otherwise) does it take to transform the healthcare system, overhaul patients' clinical experience, and optimize the delivery system (hundreds? thousands? perhaps millions?)? And how many point solutions or private cloud services will it take for a healthcare system to adopt those analytics and truly transform its practice?
The result is a lack of information sharing and coordination across multiple interrelated care practices and supporting services, duplication of effort, and complications and errors, while opportunities to focus on the patient as a whole and to collaborate on holistic, patient-centered care are lost.
In this presentation we will introduce the Patient-centered Analytic Learning Machine (PALM), a knowledge-based big-data analytic engine designed to integrate and provide a holistic view of all interrelated components of healthcare revolving around patients and providers. PALM is designed from the ground up, based on the principles of graphs and cognitive computing, to link and integrate all clinical, operational, and administrative silos in a healthcare system through the data, information, knowledge, and insight gained from their analysis, and to present a complete picture of the patient experience and of the operational and administrative support systems surrounding it. We will talk about our design philosophy, describe vignettes that illustrate it, and explain the distributed knowledge-graph technology that enables PALM under the hood.
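The linking idea behind this kind of patient-centered integration can be illustrated with a minimal sketch (the silo names, record fields, and predicate names below are hypothetical, not the actual PALM design): records from separate clinical, operational, and administrative systems contribute edges to a graph keyed on the patient, so a single lookup yields the cross-silo view that the silos individually lack.

```python
# Hypothetical sketch: linking records from separate healthcare silos into
# one patient-centered graph. Silo names and fields are illustrative only.
from collections import defaultdict

# Three disconnected "silos", each holding its own slice of the picture.
clinical = [{"patient": "p7", "dx": "atrial fibrillation"}]
operational = [{"patient": "p7", "visit": "2019-03-02 cardiology"}]
administrative = [{"patient": "p7", "claim": "CLM-88 pending"}]

# Each silo contributes (predicate, value) edges to a shared graph
# keyed on the patient identifier.
graph = defaultdict(list)
for rec in clinical:
    graph[rec["patient"]].append(("diagnosed_with", rec["dx"]))
for rec in operational:
    graph[rec["patient"]].append(("visited", rec["visit"]))
for rec in administrative:
    graph[rec["patient"]].append(("has_claim", rec["claim"]))

def patient_view(patient_id):
    """One lookup returns the cross-silo, patient-centered picture."""
    return graph[patient_id]

print(patient_view("p7"))
```

The point of the sketch is the design choice, not the data structure: once every silo asserts its facts against a shared patient node, a holistic query is a traversal rather than an N-way integration project.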