Every now and then, a scientist should step back from their specialization and reflect on the entire research domain around them. What are the big, hard problems in my general area of research? I can see several:
- Privacy and data retention: we must discover and learn how to build computer systems (hardware + software) that allow people to exchange information while keeping long-term control and accountability over the use and redistribution of their personal details. We must also do a better job of planning for the future beyond 20-50 years, long after current companies have evolved or disappeared: for example, even if today’s Google is careful with user data, will its successor in 20 years still be so careful?
- Machine models in programming languages: most software in use today has been written in languages whose machine models only provide concepts for “processing agents” where computations take place and “memory” where data is stored. However, since ~2005, computing platforms have changed: data communication (moving bits from one physical location to another) and energy usage must now be managed as well. We must develop and learn how to use new programming languages with machine models in which data communication and power consumption can be analyzed and managed.
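To illustrate the point, here is a small sketch (my own, in Python; the effect is far more pronounced in a language with contiguous arrays, such as C): both functions compute the same sum, and the language’s machine model — agents computing over an undifferentiated “memory” — considers them equivalent, yet they move data through the hardware’s memory hierarchy very differently, a cost the model has no vocabulary to express.

```python
# Two traversals of the same matrix. In the language's machine model
# they are interchangeable: same agents, same memory, same result.
# In real hardware, row-major order visits data roughly in the order
# it is laid out, while column-major order jumps around, causing much
# more data movement between memory levels. Neither cost appears
# anywhere in the program text.
N = 256
matrix = [[1] * N for _ in range(N)]

def sum_row_major(m):
    # visits each row's elements consecutively
    return sum(x for row in m for x in row)

def sum_col_major(m):
    # visits one element per row before moving to the next column
    return sum(m[i][j] for j in range(len(m[0])) for i in range(len(m)))

print(sum_row_major(matrix) == sum_col_major(matrix))  # same semantics
```

A machine model with a notion of data locality or communication cost would let a compiler, or the programmer, see that these two programs are not equivalent in practice.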
- Hardware architecture models: all hardware architectures today implement a variant of Turing/register machines, and all other machine models used in software design (especially actors, process networks and graph reduction) must be simulated on top of them; this simulation entails performance and efficiency overheads and a higher risk of bugs and processing errors. To make computers faster, more energy-efficient and safer, we must discover and learn how to build computing hardware that more closely matches the machine models assumed by high-level programming languages and used by programmers.
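As a concrete illustration of this simulation layer, here is a minimal sketch (my own; the `Actor` class is hypothetical, not from any library) of an actor emulated on a conventional machine model: the “message channel” is a shared-memory queue, the “processing agent” is an OS thread, and every message send is implemented with locks and memory writes that the hardware provides no direct support for.

```python
# A minimal "actor" simulated on top of threads and shared memory.
# The hardware knows nothing about actors or message passing: the
# mailbox, the sends and the receives are all emulated with locked
# reads and writes to shared memory -- the simulation overhead
# mentioned above.
import threading
import queue

class Actor:
    def __init__(self, behavior):
        self.mailbox = queue.Queue()   # message channel, emulated in shared memory
        self.behavior = behavior       # how this actor reacts to a message
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)          # "message send" = locked memory write

    def _run(self):
        while True:
            msg = self.mailbox.get()   # "message receive" = locks + waiting
            if msg is None:            # conventional stop signal
                break
            self.behavior(msg)

# Usage: an accumulator actor that collects the numbers it receives.
results = []
acc = Actor(lambda n: results.append(n))
for n in [1, 2, 3]:
    acc.send(n)
acc.send(None)
acc.thread.join()
print(results)  # the messages, processed in order
```

Hardware with native support for message queues and scheduling of many lightweight agents could execute such programs directly, instead of paying for this emulation on every message.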
- Engineering codes in computer systems: the Internet is humanity’s latest large-scale engineering achievement, after international road and rail networks, high-speed trains, planes and flight management, high-tech medical equipment and skyscrapers. As in those other fields of engineering, quality, reliability, maintainability and other “soft” aspects of the engineering activity around computers cannot be optimized by technology alone: they need codes and standards that human engineers worldwide agree to abide by. We must thus develop and teach new engineering codes and quality standards specialized to computer systems engineering, to increase the overall safety, reliability and quality of the global network.
- Education for programming language and operating systems engineers: the entire IT economy is built upon, and requires, an infrastructure of programming languages and operating systems. Most of today’s infrastructure relies on a foundation of system tools (especially OS kernels, C compilers and utilities) created 15-30 years ago, when language and low-level system engineering were hot and popular topics. As collective attention shifted to higher levels of abstraction (clouds, apps, APIs), the population of experts able to support and develop this foundation has become relatively scarce and thus increasingly expensive. In order to keep control, accountability, and some level of “agility” over the entire technology stack in the future, we must therefore, as a society, develop and maintain education curricula that guarantee the continuous production of skilled experts in programming language and operating system engineering.
As a researcher, I am already working on items #2 and #3; as a teacher, on items #4 and #5. I am lucky to work in a country that gives a moderate amount of attention (and thus funding) to items #1 and #2, and I know at least two top-tier institutes in my country that already work on item #5. How are your research, institution and country faring on these five points?
So what do you think? Did I miss something? Is any part unclear? Leave your comments below.