Keywords: Machine Learning, Natural Language, Language Technology, Data Mining, Knowledge Discovery, Human Factors, User Interface, Information Retrieval, Technical Analysis, Evaluation, Validation, Bookmaker
David’s major research interests lie in the general area of Artificial Intelligence and Cognitive Science, taking Language, Learning and Logic as the cornerstones for a broad Cognitive Science perspective on Artificial Intelligence and its practical applications, with a particular focus on Medical Devices and Assistive Technologies for people with health problems or disabilities, and applications relating to Information Retrieval/web search and Education.
My principal research interest is Natural Language Learning (NLL), otherwise known as Machine Learning of Natural Language and Ontology (MLNL, MLNLO), Computational Psycholinguistics or Computational Cognitive Linguistics. The focus is biologically plausible models, and while implementation techniques may vary from neural networks to statistical models or grammar-like rule-based systems, the emphasis is always that the model could be implemented in a neurologically plausible way consistent with known neuroanatomy and neurophysiology. Related interests thus include Brain Imaging and Neuroscience as well as Cognitive Psychology.
The idea is that rather than trying to figure out the grammar of a natural human language like English, or trying to learn it from various linguists' ideas of how sentences should be parsed, we get the computers to learn it the same way a baby does - which includes embedding the computer in a real or simulated world so that it can learn about language and the world and the connections between them which give rise to meaning. Conversely, we seek to implement and formally test models and theories from Psycholinguistics and Cognitive Linguistics.
Current projects are grounded using a robot baby/doll, a building-mapping robot, simulated robots and animated talking heads. Application projects focus on design and human factors evaluation of search interfaces, as well as techniques for improved parsing, semantic and ontological understanding, and audio-visual speech recognition using lip-reading and expression/emotion recognition. See the FMDAT Thinking Head site and CSEM Thinking Head site for more information about the Thinking Head project. Our Robot/Head research also includes research into speech and vision processing and the automatic learning of grounded ontologies.
We are also interested in what is going on inside human heads, including modelling and understanding human learning, cognition and affective behaviour. In this context we use electroencephalography to study brain activity, and conversely we also develop neuromarkers to detect specific cognitive behaviour or to allow direct brain control of external devices such as computers, homes and wheelchairs. See the FMDAT Brain Computer Interface page for more information about our EEG research and applications.
A computer science/engineering focus is Logic Programming (LP) and Automated Reasoning (AR), with particular attention to Concurrency and Parallelism, the application of LP/AR technology to Natural Language and Machine Learning, and their relationship to biological computing and the understanding of molecular processes. A motto of this research is “the intelligence is in the network”. This includes not only traditional arithmetic computation and paradigmatic computation such as unification, but in particular research into bioplausible models of synchrony, binding and stereology (combined stereo world modelling using multiple senses rather than just audition or vision). A major focus of this research is intelligent adaptive self-organizing neural networks, including spiking neurons and their role in synchrony and binding, as well as evolutionary/genetic models.
We have PhD scholarships covering full fees and living allowances available for both Australians (closing 31 October and 31 May) and International applicants (closing 31 August). We also can provide small Honours and Masters coursework scholarships for students who have completed an undergraduate degree and are relocating to Flinders to undertake a follow-on coursework degree (applications close 30 June and 31 January).
Other applications of Language Technology, Machine Learning and Data Mining I've worked on include Technical Analysis, Home Automation, Information Retrieval and Web Search, and I collaborate with a number of companies (notably I2Net and YourAmigo), commercializing technology in these areas. We are currently working with organizations including DSTO, Education.AU, Novitatech, BioX and Molechecks. Several past projects/collaborations have led to significant commercial products: The I2Net Orion product allows you to control your home and entertainment system hands-free by talking to it, and is marketed worldwide as Clipsal Homespeak as part of the Clipsal C-bus range; YourAmigo is an acknowledged world leader in deep/dynamic web search, bringing to light pages that are missed by conventional search engines/spidering and boosting rankings and sales of client sites. YourAmigo has offices in several countries and websites in China, Japan, the US, the UK, Germany, Spain, Brazil and Australia. Optimized customer sites include www.reebokstore.co.uk, www.samsungparts.com, www.flightcentre.com.au, www.improvementsdepot.com and www.readycomponents.com.
Much of our recent research has focussed on the validation, evaluation and fusion of computational models, including the development of the concepts of informedness and markedness as principled alternatives to accuracy, recall and precision. Informedness is the probability that an informed decision was made, or equivalently the proportion of the time a correct decision is made using the available information. It is calculated using the Bookmaker Algorithm and can also be mapped to an information theoretic framework as information loss. Where information is used to make the wrong decision (that is, the wrong decision is made more often than predicted by chance), Bookmaker returns a negative value interpretable as misinformedness. Markedness is the complementary concept: the extent to which a potential marker is indeed marked by the target condition. It corresponds to DeltaP as used in Psychology.
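In the two-class (dichotomous) case these measures have simple closed forms: informedness reduces to recall + inverse recall − 1 (sensitivity + specificity − 1) and markedness to precision + inverse precision − 1. A minimal sketch computing both from a confusion matrix (the function names and example counts are illustrative, not taken from the Bookmaker implementation itself):

```python
# Two-class illustration of informedness and markedness computed from a
# confusion matrix (tp, fn, fp, tn). Chance-level prediction scores ~0;
# systematically wrong prediction scores negative (misinformedness).

def informedness(tp, fn, fp, tn):
    """Probability of an informed decision: recall + inverse recall - 1."""
    recall = tp / (tp + fn)          # true positive rate (sensitivity)
    inverse_recall = tn / (tn + fp)  # true negative rate (specificity)
    return recall + inverse_recall - 1

def markedness(tp, fn, fp, tn):
    """Extent to which the predictor is marked by the condition:
    precision + inverse precision - 1 (DeltaP)."""
    precision = tp / (tp + fp)          # positive predictive value
    inverse_precision = tn / (tn + fn)  # negative predictive value
    return precision + inverse_precision - 1

print(round(informedness(40, 10, 20, 30), 3))  # 0.8 + 0.6 - 1 -> 0.4
print(round(markedness(40, 10, 20, 30), 3))    # 0.667 + 0.75 - 1 -> 0.417
```

Note that informedness is prediction-conditioned on the real classes and markedness on the predicted classes, which is why they form a complementary pair.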
My interest in machine learning and neural nets applied to technical analysis led to the development of the Bookmaker informedness methodology, but also derives from my activities as a private trader of shares, futures and other derivatives - I have been quite successful with the SFE's SPI contract. One of the tools I developed to assist me in my futures trading, ChartAnnotate, is quite spectacular with its rainbow marking of the standard deviations, and has started to attract considerable interest. It turns bar charts into distribution profiles similar to the Market Profiles introduced by Steidlmayer at the Chicago Board of Trade. Comparing these volume, tick or time distributions with a standard normal distribution gives an indication of the tendency of the market. For more information on this trading side of me, see my private home page.
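The distribution-profile idea can be sketched as follows: accumulate volume at each price level across bars (as in a Market Profile), then compare the resulting volume-at-price distribution with a symmetric normal shape, here via its skewness. This is a hypothetical illustration of the general technique, not ChartAnnotate's actual implementation; all names and figures are invented:

```python
# Hypothetical sketch: build a volume-at-price profile from price bars and
# measure its asymmetry. A skew of 0 matches a symmetric (normal-like)
# profile; a non-zero skew indicates volume concentrating to one side.
from collections import defaultdict

def volume_profile(bars, tick=1.0):
    """bars: list of (low, high, volume) tuples.
    Spread each bar's volume evenly over its price levels (tick spacing)."""
    profile = defaultdict(float)
    for low, high, volume in bars:
        levels = [low + i * tick for i in range(int(round((high - low) / tick)) + 1)]
        for level in levels:
            profile[level] += volume / len(levels)
    return dict(profile)

def profile_skew(profile):
    """Skewness of the volume-weighted price distribution (0 if symmetric)."""
    total = sum(profile.values())
    mean = sum(p * v for p, v in profile.items()) / total
    var = sum(v * (p - mean) ** 2 for p, v in profile.items()) / total
    if var == 0:
        return 0.0
    return sum(v * (p - mean) ** 3 for p, v in profile.items()) / total / var ** 1.5

bars = [(100, 102, 300), (101, 103, 500), (102, 104, 200)]  # invented data
print(round(profile_skew(volume_profile(bars)), 3))
```

The same accumulation works for tick or time distributions by substituting the appropriate count for volume.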
For up-to-date information on our broader research collaborations at Flinders see the Knowledge Discovery, Intelligent Systems and Language Technology pages. I am a member of four Areas of Strategic Research Investment at Flinders: Flinders Medical Devices And Technologies, the Centre for Neuroscience, Applied Cognitive Psychology, and the Centre for Analysis of Educational Futures. For information on our courses see http://www.flinders.edu.au/engineering/ and http://www.csem.flinders.edu.au/.
My publications include a monograph, Machine Learning of Natural Language, around a hundred papers, and a number of proceedings in the area of NLL/MLNL/MLNLO (terms I coined as increasingly formal and complete designations for the field). My tutorials and my ChartAnnotate software for technical analysis/trading of financial markets may be found on my private home page. Many of my publications may be downloaded from my publications page, as well as from a collection of cached PDFs.
I supervise, and am happy to supervise, projects related to the NL, ML and CLP research mentioned here. Examples of projects I have supervised may be extracted from my publications list. My main collaborators at present are Sherry Randhawa and Nasser Asgari (Robotics), Kenneth Pope (Blind Signal Separation), Richard Clark and John Willoughby (EEG and ERP of Learning/Language + BCI). My current postdocs are Richard Leibbrandt, Dongqiang Yang, Sean Fitzgibbon, Martin Luerssen and Trent Lewis.
My father, Dr B. Ward Powers, director of Tyndale College, is a theologian and also a linguist, and has written several books, including a textbook with an innovative modern approach to New Testament Greek (the topic of his Masters), a new analysis of the Greek third declension, and a comprehensive exposition of the Bible’s teaching on sex, marriage and divorce and the role of women in the church (a topic arising from his PhD). If it's not me you're after, maybe it's him! We've also recently started a blog where he answers questions and discusses these issues.
To ring me from outside Australia (50 countries) for the cost of a local call: