Winner of the TR100 award, given to the world's top 100 young innovators in technology.
- Main positions held:
- CWI [research scientist], MIT Lab for Computer Science [research scientist], MIT World Wide Web Consortium (W3C) [research scientist], UNIPD [professor], Volunia [CTO], European Institute for Science, Media and Democracy [CTO].
- Main world-impact technologies:
P3P (Platform for Privacy Preferences) [the world standard for web privacy],
XML-Query (XQuery) [the world standard for semi-structured data handling],
Web Ontology Language (OWL) [the world standard for web knowledge],
APPEL [proposed privacy selection world standard],
Metalog [the world's first semantic web system],
Volunia [the world's first social search engine and meta mappings system],
Theory of Vaccines [optimal correctness criteria],
SEP (search engine persuasion) [the world's first formal study of SEO],
Hyperinformation [the world's first hyper technique for search engines], and
Hyper Search [the world's first hyper search engine, Google's forerunner].
B.S. degree with highest honors, and M.S. degree in Mathematics summa cum laude, with every exam mark throughout the studies at the maximum (30/30).
Ph.D. in Computational Mathematics and Computer Science, with a thesis that won an EATCS (European Association for Theoretical Computer Science) best Ph.D. thesis award of the year for the invention of local analysis in programming languages.
Research scientist at the CWI (the Dutch National Research Center for Mathematics and Computer Science)
Research scientist at MIT (USA) in the World Wide Web Consortium (W3C) (mostly in the Technology & Society Domain, and partly in the Architecture Domain as well). Honoured to have worked with, and now to count as a friend, a man of wisdom like Tim Berners-Lee, inventor of the Web and also a great person.
Initiator of the Query Languages effort at W3C (see for instance QL'98). I started the XML-Query (XQuery) project (of which I was the W3C contact for quite a number of years...), chartered to develop the world standard for querying XML and, finally, to provide the long-due integration between the Web and the database world.
Co-founder and Executive Committee member of the European Commission Network of Excellence on Web Reasoning, REWERSE.
Been a consultant for the European Commission on Knowledge Technologies, and a scientific/strategic consultant for a variety of organizations, entities and private companies. Currently cooperating with a variety of companies on various industrial projects (esp. Web and IoT).
Been the person responsible at W3C for the IEEE Internet Best Practices Activity
Been chair and/or PC member of numerous events; for example, rolling history back: chair of Query Languages'98 (QL'98), Program Committee member of the Eighth International World Wide Web Conference (WWW8) and of the Ninth International World Wide Web Conference (WWW9) in Amsterdam, Vice-Chair at WWW10, and so on.
Editor-in-chief of the Open Journal of Web Technologies (OJWT).
CTO (Chief Technical Officer) of the European Institute for Science, Media and Democracy, the Permanent Platform for European Excellence. A major technology that I proposed, and that is under development, is REISearch, the European Research Excellence Innovation Network (the first comprehensive, interdisciplinary and intersectoral network for European citizens, media and researchers). REISearch got support, among others, from the European Parliament and its presidency (Jerzy Buzek and Martin Schulz) and from the presidency of the European Commission (José Manuel Barroso first, and Jean-Claude Juncker now), and it also received important ongoing financial support (million-dollar scale) from the Orange Group and the EU (Directorate-General for Communications Networks, Content and Technology).
Ideator and creator of Volunia, the world's first social search engine and meta mapping system. With Volunia I tried to shake the technology world by changing the way we see and navigate the web (each web site is viewed under a city metaphor, and the navigation model becomes visual and much more informative), and how we socially interact with people during our search and navigation process. For more details, see for instance the reviews at Search Engine Land and Search Engine Watch. It has been an amazing technical journey, building world-wide big data systems from scratch, developing innovative algorithms, interfaces, server backends, industrial-scale performance systems, social analysis, real-time massive data handling, and a wealth of other topics. This new multi-million dollar social search engine, built for a worldwide audience and localized into the world's 12 main languages, stirred huge interest and media impact all over the world already in its public beta phase; then, at public launch, I was meanly ousted from the company by the funding entrepreneur, who wanted to take full control over the company (first part of the story here). There is much more to say, but the rest of the story pertains to morality and dishonesty in business, not technical innovation.
Got a variety of awards, including for instance the TR100 award, given to the world's top 100 young innovators in technology; the Gini Foundation Award for innovative research; the IBM young scientist award; the Lifetime Membership Award of the Oxford Society for "his lifetime achievements, and the efforts for the development of an XML Query standard"; and the Masi Award.
Successfully attacked a variety of open research problems, like:
- the last open problem (which had remained unsolved for 10 years) in the field of basic modularity for rewriting systems (Journal of Symbolic Computation, vol.22)
- the open problem of existential termination in logic programs
- the open problem of constrained evolution of neural networks
Been the initiator of some important fields:
- Modularity: the inventor of the theory of optimal modular analysis for complex systems ("local analysis").
- Complex systems (as the word says... "complex") are too heavy to analyze as a whole, so people try to apply modular analysis to them: this means studying smaller pieces of the system, and from those inferring properties of the whole complex system. Before local analysis, this was more of an art, with no formal understanding of the reasons, and above all the limitations, of why/when studying the "small" was better than studying the "large". Local analysis provided the formal setting to understand the differences, and made it possible to study precisely what the limits of the "small versus the large" are. A byproduct of local analysis has been the Theory of Vaccines, showing how the best analysis in the small can be obtained by modeling the idea of a "vaccine" (familiar from medicine) within the apparently unrelated field of computer science.
- Web Search Engines: the world's first to introduce the concept of hyperinformation in search engines, an idea later developed for example in Google
- There has been a big shift in the way search engines like Google work today, due to the quantum leap of considering so-called "hyper-information" instead of just the plain "textual information" that had been used in the past. Hyperinformation was introduced in 1996 (original report, first hypersearch engine built and running, submission to WWW6), and was then accepted at WWW6 (final publication in 1997), with a critical analysis of the situation at the time. Hyperinformation can be used in at least two ways: one is to measure "visibility", the other is to measure the "real potential" of a page. The second is the one I indicated as the best to use. After the presentation about hyperinformation at WWW6, Larry Page and I had a nice table discussion on the pros and cons of the two ways of using hyperinformation. The next year, Page and Brin wrote a paper about PageRank (correctly citing the hyperinformation predecessor), which is essentially a "visibility" way of using hyperinformation (the dual of the "potential"), and soon after developed Google based on this hyperinformation measure.
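The "visibility" use of hyperinformation, as later realized in PageRank, can be sketched as a power iteration over the link graph. This is only a minimal illustrative sketch: the three-page graph, the damping factor and the function names are hypothetical, not taken from the original papers.

```python
# Minimal sketch of the "visibility" use of hyperinformation:
# a PageRank-style power iteration over a tiny hypothetical link graph.
# Graph, damping factor and iteration count are illustrative assumptions.

def visibility_rank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform visibility
    for _ in range(iterations):
        new = {}
        for p in pages:
            # A page's visibility is the damped sum of the visibility
            # flowing in from every page that links to it.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = visibility_rank(graph)
# C is linked from both A and B, so it accumulates the most visibility.
```

The "potential" measure discussed in the text would instead look at what a page can reach going outward through its links, rather than at what flows into it.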
- Web Advertisement & Search Engines: the world's first to formally introduce and study the problem of search engine persuasion (SEP)
- Nowadays, concepts like "search engine optimization", "search engine spam" and so on are well known: they describe how to artificially "pump up" web pages into the top ten of search engines. There are lots of companies that charge you for such services, and all sorts of techniques are employed.
The first study of this problem, which is more formally called SEP (search engine persuasion), appeared back in 1997, with the article "Security of World Wide Web Search engines", which for the first time formally introduced the concept, and explained and classified the possible techniques and solutions. Amazingly enough, what looked like visionary science fiction in 1997 (the same concept, and techniques like invisibility and so on) became reality years later, and is now commonplace technology in a flourishing market of the utmost importance.
- Semantic Web: the world's first semantic web system (Metalog, the next-generation system for querying and reasoning in the semantic web)
- "Semantic Web" is nowadays a common word for a boosting field. Back in time (1998), however, this was just a cool word for few intimates, gathering with the inventor of the Web (Tim Berners-Lee) in front of a whiteboard at MIT. During those times, the need to pass from a visionary idea to practice emerged, and this lead to the development of Metalog, the first semantic web system to be produced in history. Metalog allowed querying and reasoning for the Semantic Web, and more than that: to do it in a way that people can find easy and natural, via a pseudo natural language interface. Again, what just seemed visionary at the time is now a boom, and the Semantic Web is nowadays a huge emerging area, with lot of applications and papers being developed, towards the original vision of a third-generation Web of Information.