Omid Madani
I am interested in all aspects of intelligence, and all matters mental (especially from a
computational perspective)! From 2014 to 2024, at Cisco, I developed
machine learning and data analytics pipelines, almost entirely
unsupervised, to help define and enforce network policies and make
data centers more secure. Previously, in reverse chronological order,
I was at Google (the perception group and ML for YouTube, NLP), at SRI
(AI Center), Yahoo! Research, U Alberta, U Washington, U Houston, and
Saddleback Community College, and spent my formative years in Iran
(Bandar Abbas and Tehran), then Dubai, UAE.
My research has included the following threads:
- I am inspired by how our minds might work, and in particular by the
types of problems that we solve. An important feature of our
intelligence, one that perhaps separates us humans from many other
animals, is the huge number of inter-related concepts that we
(apparently) acquire, develop, and effectively use. Here,
by "concept" I mean a recurring (and useful) perceptual pattern
(supported by a locus of representation in our brain), such
as words and phrases uttered in continuous speech, or visual objects
and entities such as books, faces, common action sequences, etc.
What tasks and problems, processes and algorithms, and
representations and data structures support developing such
complexity? These questions are very broad and provide a starting
point. I hope to contribute to answering some of them, in
particular from the perspective of computational learning and
development.
- I have worked mostly on AI, and machine learning in particular,
for more than two decades now! Key problem properties have revolved
around unsupervised and self-supervised learning, online learning,
large-scale learning, models of active learning, multiview learning,
data mining, situated/embedded systems (in rich environments),
imperfect/noisy features and labels/feedback, multiple (learning)
systems interacting, and so on.
- I have been working on efficient learning algorithms for
supervised learning problems with high input and output
dimensionalities (potentially huge numbers of features and
classes/concepts). The focus has been on online algorithms
that can handle dynamic, growing sets of classes, possible non-stationarities,
etc.
- Other related threads in my work include empirical and theoretical
analyses of algorithms (in particular, motivated by AI
decision-making/planning problems, such as Markov Decision Processes
(MDPs)), exploration of applications, and discovery of new problems in
the applications. Past and future application areas include
information retrieval, text and natural language processing, game
playing, personalization, vision, and so on.
Please see the following links
for further information on my work.
- Publications
- Selections from my work:
-
New:
Tracking Changing Probabilities via Dynamic Learners,
in arXiv, 2024. This continues the work on prediction games: online (lifelong) multiclass probabilistic predictors for non-stationary settings.
-
New:
An Information Theoretic Score for Learning Hierarchical Concepts,
Frontiers in Computational Neuroscience (in celebration of the
75th anniversary of Claude Shannon's paper on a mathematical theory of communication), 2023.
-
New Dataset: A dataset of 22 graphs (from our work at Cisco)
is now available at Stanford SNAP (thanks to Rok Sosic). It contains
edges (TCP/UDP) between nodes of distributed applications, and two graphs have reference groupings of nodes (ground truth).
Our IWSPA 2022 paper describes the data
(here's the README file).
- A dataset of multimodal
feature vectors, YouTube Multiview Video Games, available
at UCI repository,
especially useful for multiview (multimodal) machine learning
research (also
here in smaller partitions).
-
Presentation video
on "index learning" (efficient linear classifier learning
for many classes, 1000s and beyond), a precursor to prediction games, posted
by sfbayacm.
- A Python
implementation of sparse EMA ("Emma"), a version of (sparse) index learning,
suitable for non-stationary many-class problems (thanks to Jose Antonio).
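To give a rough flavor of the EMA idea behind such predictors, here is a minimal sketch: track a probability distribution over a growing set of classes with an exponential moving average, pruning tiny weights to keep the model sparse. This is an illustrative toy (the class name, rate, and pruning threshold are my own choices), not the released implementation or the exact algorithm of the papers above.

```python
class SparseEMA:
    """Toy sparse EMA tracker of a changing class distribution."""

    def __init__(self, rate=0.05, prune_below=0.01):
        self.rate = rate                 # EMA update rate (higher -> faster adaptation)
        self.prune_below = prune_below   # drop weights smaller than this (sparsity)
        self.weights = {}                # class -> weight, stored sparsely

    def update(self, observed_class):
        # Decay all existing weights toward 0, pruning the tiny ones...
        for c in list(self.weights):
            self.weights[c] *= (1.0 - self.rate)
            if self.weights[c] < self.prune_below:
                del self.weights[c]
        # ...then move weight toward the class just observed.
        self.weights[observed_class] = (
            self.weights.get(observed_class, 0.0) + self.rate
        )

    def predict(self):
        # Normalize the surviving weights into a probability distribution.
        total = sum(self.weights.values()) or 1.0
        return {c: w / total for c, w in self.weights.items()}
```

Fed a stream of observed classes, such a tracker adapts to a distribution shift at a speed set by the rate, while the pruning threshold keeps memory proportional to the number of currently relevant classes rather than all classes ever seen.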
A
poem by Omar
Khayyam.