How did you get into electrical engineering and when did you start?

Interestingly, I am actually not an electronics or electrical engineer – all my degrees are in Computer Science. In India, in the last year of high school, you have to take competitive exams to enter the engineering colleges, especially back in 1987 when my state had only 7 engineering colleges. I ranked 11th in the state – and in those days, the top 20-30 could opt for whatever branch of engineering they wanted, at the best of the state engineering colleges. So I opted for Computer Science and Engineering. However, in those days most of our professors came from electronics and telecommunication engineering, as Computer Science was a new discipline. So our syllabus was full of courses that an electronics engineering program would normally have – electronic devices, electrical circuit analysis, electronic circuits I and II, digital circuits, transmission lines, control theory, electrical machines, electrical instrumentation, and a number of physics and mathematics classes. Along with these, we had digital design I and II, microprocessors, computer organization, memory and I/O, microprocessor-based system design, etc. Programming was not emphasized that much because computers were expensive – the labs were always full of students – so we had to write programs first by hand on paper, then mentally execute and debug them, and finally run them if computer time was available. So when we studied courses such as Data Structures, Numerical Methods, Programming Languages, Assembly Languages, Operating Systems, Database Management, or Compiler Design, we did more of our programming on paper than on a computer. None of my classmates could afford a computer at home – email did not exist yet, and the Internet was something we had heard about but never used.

As a result, our electronics knowledge became better than our programming knowledge – but I did not like that much. My favorite subject was mathematics, which led me more towards theory of computation, semantics of programming, axiomatic proof methods, etc. – rather than engineering or designing real systems. I wanted to live in the world of abstraction rather than that of the concrete – not a good idea for an engineer, but very good for a computer scientist. As a result, when I came to the US for my graduate studies, I went to a rather unknown school with a number of very well-known theoretical computer scientists – one of them was the editor-in-chief of the Journal of the ACM, another received the Turing Award in 1993, and so on.

Even after my PhD, I went to work on software at GTE Labs for two years – GTE later became Verizon after a couple of big mergers. In 1999, however, I went to work for Intel – I joined the formal verification group working on formal verification of parts of the Itanium chip (at that time the name Itanium had not been coined, so it was code-named Merced). I was hired based on a day of puzzle solving and questions on my favorite topics – formal verification, temporal logic, binary decision diagrams, and so on. However, as soon as I joined, I realized that I had to recall my knowledge of computer architecture, digital logic, protocols, and so on from my undergraduate days. Furthermore, I had to learn Verilog, VHDL, and Intel’s own iHDL – along with tools for simulation, dynamic tests, coverage tools, and so on. So I came back more into engineering – and my days of living in the abstract world of Kripke structures, temporal logic, Horn logic, and NP-completeness were now merged with the concrete engineering of the world’s first IA-64 architecture implementation. This was a wonderful experience with a very talented group of five formal verification engineers who all learned the hardware on the job. However, 6 months later, Merced taped out, and I moved on to a 5th-generation project – a new microarchitecture to be designed – the initial exploration team for an exciting project with about 20 top people from Intel, Santa Clara. Eventually, 6 months after that, Intel decided to cut it from their road map – but during this time, I started working on system-level modeling and design, and the new languages of that time, such as SystemC, and various alternatives for making design more high-level. It was a great experience learning about system-level languages and design – and the hope of one day getting tools that could synthesize a complex processor from a very high-level functional description with constraints.
After this was cancelled, I joined a group called IA-64 Platform Architecture, and started looking at bus protocols, their formal specification, and automatic synthesis of verification checkers that could be distributed to OEMs. It was my first experience writing a compiler – and it was fun as long as it lasted – but I was getting more interested in advancing the research agenda of system-level design languages and synthesis techniques than in doing day-to-day development of tools. So I eventually quit in 2001 and became a research associate with Prof. Rajesh Gupta at the University of California, Irvine. At this time, I changed my mind about industry and wanted to be in academia. But I needed to add to my publication list, as academia is a lot about publication – so during the year and a few months I was at Irvine, I worked on various topics: system-level power management, improving the simulation of SystemC models by concurrent execution, compositional design based on existing components, and so on. Publications were coming very fast – so I decided to seek a tenure-track position, which happened in 2002 when I joined the Virginia Tech Electrical and Computer Engineering department as an assistant professor. 10 years later, I have become a full professor – graduated 9 PhDs and about 10 MS students with thesis research, supervised 4 postdoctoral fellows, supervised more than 20 undergraduate research projects, and advised some foreign PhD students who visited my lab for collaborative research. So now, I feel like I have made some contribution to society in terms of educating the next generation – and that is a very good feeling. Educating and inspiring young minds and nurturing creative minds is something of a blessing in my view.

In the meantime, my research has shifted from system-level design to software synthesis, as well as to the Smart Grid and cyber security. I do not build artifacts like an engineer any more, but my students do – mostly software artifacts, along with theory of design. So my main products these days are articles for journals, conferences, magazines, and books. This is rather a strange path for an engineer to take – the path of a writer – but I do not feel like a writer in the traditional sense. These writings are about the results of our thoughts and experiments, and the formulation of design methods and the theory supporting such methods. So it is still engineering – but instead of creating artifacts that can be directly used or sold, we build prototypes which can then be demonstrated to prove an idea. If they are to be used in real engineering, they must be reengineered with robust practices – which we do not have the time or funding for.

What are your favorite hardware tools that you use?

Do you mean hardware design tools? If you mean hardware as tools, the only things that my students and I use are computers (these days multi-core), FPGAs for prototyping and experimentation, external storage devices, printers, scanners, and our Android devices. In terms of design tools, we use SystemC and Verilog simulators, some FPGA synthesis tools, some power estimation tools for RTL, and in some projects we have used Bluespec tools.

What are your favorite software tools that you use?

The software we use mostly consists of compilers (C/C++ mostly), scripting languages, Java, Eclipse or Visual Studio, GME (a generic modeling environment developed at Vanderbilt), etc. Other than that, in our embedded software synthesis work, we use the SIGNAL compiler and Polychrony toolset (developed at INRIA, France), the Quartz compiler and Averest toolset (developed at the University of Kaiserslautern), and our own MRICDF compiler and EmCodeSyn environment for synthesis of embedded software. For Smart Grid simulation work we used NS2, OPNET, and the PSLF power system simulator, and we developed our own co-simulator.

What is the hardest/trickiest bug you have ever fixed?

This is hard to answer – my real industrial engineering career was only four years (2 years at GTE Labs and 2 years at Intel). Most of the other time, I have been managing development as a supervisor, not as a developer. So the bug fixing I have done has not been too hard – except that when I was at GTE our software was actually deployed at the Network Operations Center, and hence I had to fix many bugs – but nothing stands out in my mind as particularly tricky or difficult.

What is on your bookshelf?

Too many books – but I am now very interested in nonlinear phenomena, synchronization, complex systems theory, emergent behaviors, etc. So some of the favorites I have on my shelf are Strogatz’s ‘Nonlinear Dynamics’ and ‘Sync’ – yet to be read – and Barabasi’s book ‘Linked’, which I have read a few times.

Do you have any tricks up your sleeve?

What kind of tricks? Engineering tricks? Nothing that I would call a trick. Wisdom – yes, I am old now – so some wisdom has trickled through from experience. One piece of it: hard work is essential for any success.

What has been your favorite project?

My current favorite project is a project I am doing for the Air Force Labs on automated synthesis of safety-critical software. But my favorites change from time to time – so 5 years down the line – if you ask me – I will probably have another one.

Do you have any note-worthy engineering experiences? (Great accomplishments, awards, etc.)

Awards – quite a few – but I am not sure if they are for engineering experience or not. At GTE Labs I got an “Excellence Award” for a successful deployment, and in academia I have received a number of awards. Here is a list:

• PECASE (Presidential Early Career Award for Scientists and Engineers, 2003) and NSF CAREER Award (2003-2008)
• Friedrich Wilhelm Bessel Research Award from the Alexander von Humboldt Foundation of Germany, 2008 (a lifetime achievement award for scientists for their achievements within 12 years of their doctoral degree)
• Distinguished Alumni Award, State University of New York at Albany, 2007
• Invited by the National Academy of Engineering to the Frontiers of Engineering Conference in 2005 (only 88 scientists and engineers nationwide were invited)
• Invited by the National Academy of Sciences to the Frontiers of Science Conference in 2007 (only 100 scientists below the age of 45 were invited)
• Invited by the National Academy of Engineering to the US-German Frontiers of Engineering Conference in 2009 (only 60 scientists and engineers from the US and Germany together were invited)
• Virginia Tech College of Engineering Faculty Fellow (2004)
• IEEE Computer Society Distinguished Visitor (2007-present)
• ACM Distinguished Speaker (2007-present)

Do you have any experiential stories you would like to share?

When I was at GTE Labs, one of the projects we did involved fair distribution of tasks among network operations center workers, whose computer screens would show the tasks of taking care of certain network problems from among all the network problems at that time. This required a round-robin algorithm for distributing these tasks fairly among about 30 operators, but we also had to ensure that they did not squat on a task too long in order to avoid being assigned more tasks, that they received tasks they had some experience with, and that credits were given to those who successfully discharged their tasks. So this was a kind of constrained-optimization scheduling problem – which I solved in a heuristic manner and then programmed on our surrogate system at the lab. However, I implemented it for one type of task only; other types were to be added later. But I forgot about that, and showed it to my team leader, who also did not notice that the tasks were all of the same type – and he actually went to the NOC in Dallas to install it. After installation, it was chaos at the NOC, because no one was getting assigned any other types of tasks – he had to roll back, and he was very upset at me. I offered to fix it, but he fixed it himself – and while fixing it he realized it was not straightforward because of the fairness constraints, as different types took different amounts of effort by the operators to discharge. I am still ashamed of my mistake – but in my defense, I had actually moved on to a new project by the time it happened. I was experimenting with a software architecture for next-generation network management tools using middleware – event services, notification services, and so on – and I was too excited by the new technology I was playing with to bother about that old C/C++ implementation.
However, I would not advise anyone to neglect the software that is deployed at the NOC in favor of playing with new technology just because it is cool. I think I was too immature at that time.
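The fair-distribution problem in that story can be sketched as a small dispatcher: each operator carries a load weighted by the per-type effort of their outstanding tasks, a new task goes to the least-loaded qualified operator with round-robin order breaking ties, and discharging a task reduces an operator's load so new work flows back to them. This is only an illustrative reconstruction under those stated constraints – the operator names, task types, and effort weights are invented for the example, not the original GTE implementation.

```python
from collections import deque

# Illustrative effort weights per task type (hypothetical, not GTE's values).
EFFORT = {"alarm": 1.0, "lineout": 3.0, "config": 2.0}

class Dispatcher:
    def __init__(self, operators):
        # operators: dict mapping operator name -> set of task types they can handle
        self.skills = operators
        self.load = {name: 0.0 for name in operators}  # outstanding effort per operator
        self.queue = deque(operators)                  # round-robin tie-break order

    def assign(self, task_type):
        """Give the task to the least-loaded qualified operator; ties go to
        whoever has waited longest in round-robin order."""
        qualified = [n for n in self.queue if task_type in self.skills[n]]
        if not qualified:
            raise ValueError(f"no operator can handle {task_type!r}")
        best = min(qualified, key=lambda n: self.load[n])  # first minimum in RR order
        self.load[best] += EFFORT[task_type]
        # Move the chosen operator to the back so they lose future tie-breaks.
        self.queue.remove(best)
        self.queue.append(best)
        return best

    def discharge(self, name, task_type):
        """Operator finished (discharged) a task: free up their load."""
        self.load[name] -= EFFORT[task_type]
```

Keying the assignment on effort-weighted load rather than a plain task count is what handles the problem the team leader ran into: task types that take different amounts of effort to discharge would otherwise make a simple round-robin unfair.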

What are you currently working on?

I am focusing on three disparate projects now:

1. Cyber Security of SCADA systems
2. Correct-by-Construction Software Synthesis from Formal Specifications
3. Designing domain specific languages related to Cyber security (but I will not elaborate on this one)

What direction do you see your business heading in the next few years?

The answer to this one depends on what my business is. A few years ago, I was deeply involved in the Electronic Design Automation community and was working with companies in that field, and that was kind of my business. The outlook for that business is not very good at the moment, especially on the front end. 10 years ago there was much excitement about system-level design (SystemC, SystemVerilog, system synthesis tools, behavioral equivalence checking tools, etc.), but the field has now kind of settled on the idea that those system-level design languages can be used for simulation (especially at the transaction level); the dream of synthesizing an optimal system-on-chip from such a high level is no longer dreamt, and the excitement is lost. The back end is going strong, but innovation there has been trickling in rather than gushing. The EDA industry is also small in dollar terms, possibly less than 20B dollars, so it is hard to sustain a lot of innovation. Compared to that, a lot more innovation is happening in machine learning, data mining, and so on – as the market where Google, Facebook, etc. play is huge.

However, my business these days is safety-critical embedded software and cyber security – and lots of innovation is needed there. It has not reached the diminishing-returns part of the curve, so it is an exciting time for these fields.

What challenges do you foresee in our industry?

In the industry of system design, the computational fabric is becoming more and more parallel, with more cores being integrated, thanks to Moore’s law – and the power wall. However, utilizing the massive parallelism these afford requires new innovations in parallelizing applications, and the ability to exploit the parallelism without taxing programmers’ abilities too much. Thus automated parallelization of some kind, or new programming models and new ways of thinking, is needed. I do not see that happening as fast as we need it. The GPU craze is on – but not all problems benefit from a GPU implementation, as GPUs depend heavily on SIMD. So a characterization of which problems benefit from GPUs is needed. Of course, the memory architecture and on-chip communication bottlenecks are also challenges.

There is a rush to capture the mobile market – Intel spent a lot of resources on Atom, but I think large-scale computing is still needed, and Intel should focus more on that. As data-centric and analytical approaches to society and life gain momentum, a large amount of parallel data-crunching ability will be needed. So we need Intel to focus on that and leave the mobile market to those who can do it better. If needed, Intel should spin off the mobile part and focus on computing power.

I do not see the EDA industry growing much – unless there is a breakthrough and a disruptive technology.
Big data analytics, data mining, machine learning, artificial cognition – these are the topics du jour. They have gained momentum with societal needs, and need a lot of talent to work on them. Those who worked on these in the past were often not up to the task.
Safety-critical embedded systems for avionics, mission-critical systems, real-time systems – automotive, medical systems, and so on – will be very important, and increasingly challenging, as more innovations for automatic correctness guarantees are found.

Cyber security, cyber dominance, and cyber warfare are the frontiers of our technology, and the frontiers of the next wars. So I will be watching these with interest and possibly working on them.