Interaction between Academics and Industry

joeframbach

Limp Gawd
Joined
Jun 22, 2005
Messages
284
I have a paper/presentation due this Monday, and so far I have something of an introduction and have worked out the topic of the paper. I would like to share it with the forums here. Some feedback would be awesome!

Industry is concerned with practical, physical applications, and Academia is concerned with theoretical, metaphysical concepts. This statement may hold in fields such as Computer Science, Physics, and Mathematics, but the scope of the paper deals with Computer Science. Industry likes to use state-of-the-art technology while applying old, simple techniques. Academia uses archaic technology while developing smarter algorithms and techniques. There is some conflict in that Industry must re-train those coming from Academia, and this is very costly. What can be done to reconcile Industry and Academia? Vocational schools, internship programs, and academic grants help, but there is a deeper solution. I need to expand on the solution in this paper.

The Full Text said:
The Reconciliation of Academia and Industry in Computer Science
Academia and Industry are two separate nations caught in an epic battle for the survival of their inhabitants. This battle has been waged for centuries, and has claimed many of its soldiers. In the 17th century, Galileo fought for Academic progress by defending the Copernican claim that the Earth revolves around the Sun. The Industry of the time was the Church, as it was the only group that promulgated physical laws. In the end, the work of Academia forced changes in the beliefs of Industry, although Galileo had been incarcerated and silenced for his actions. The current age has a different Academia and a different Industry. Academia is now built solidly on higher education and theoretical work, and Industry is built on financial progress and real-world applications. Throughout the history of this interaction, many have attempted to bring the two nations together in harmony, through the use of vocational schools, internship programs, and academic grants. However, many issues are not yet understood, so the two camps remain separated.

Why Computer Science?
Computer Science is a field where there exist two very separate worlds, the Practical and the Theoretical. Practical Computer Science focuses on the development of end-user technology, such as web browsers, operating systems, and electronics firmware. Theoretical Computer Science focuses on the development of concepts that are not usually seen by the end user, such as programming language semantics, algorithm analysis, and automata theory. This paper would easily extend into the fields of Physics and Mathematics, other fields of study where such a division exists. In Physics, much work is done with theoretical concepts such as quantum theory, and much work is done in practical applications such as aerospace engineering. This paper does not apply so well to the fields of Biology or Chemistry, where work is done at a mostly practical level, or to the fields of Psychology and Philosophy, where work is done at a mostly theoretical level.
This separation between the Practical and Theoretical worlds is directly correlated with the Industry/Academia separation. Most theorists are found in Academia, and those interested in physical applications are found in Industry. In Computer Science, Engineers are usually found active in the workforce, and Theorists are usually found active in the classroom. As any scholar of Engineering would confess, the study of Engineering focuses on the already-existing world and its practical applications. Conversely, a practitioner of Philosophy would say that their work focuses on metaphysics and theory. Computer Science, however, does not focus solely on theory or physical results, nor are the majority of its practitioners found exclusively in either the workforce or the classroom. Scholars of Computer Science’s Academia are not out to describe pre-existing phenomena; they are at the forefront of theoretical work, creating new ideas and new laws to govern their new world. The workforce of Computer Science’s Industry, however, does not rely so much on theoretical progress; it thrives on physical processes and results.

What is the discrepancy?
Academia and Industry complement each other in functionality and purpose, but the fact that they have different goals leads to conflict as well. This shows very clearly in what happens to information and research in the two camps: Academia tries to spread them far and wide, while Industry tries to hoard them for financial gain. So while Academia needs the financial support of Industry, and Industry needs science, technology, and research from Academia, there is some tension when trying to decide what to do with the products of collaboration.
Industry sees little need for theoretical work, believing it has no practical applications in the “real” world. As long as applications “just work” and development time is kept to a minimum, the intricacies of theoretical algorithm development hold little value. However, the theoretical approach taken by Academia does hold much value for those interested. Efficiency and complexity analysis would, on a large scale, improve performance in real-world applications. Therefore, this singular problem becomes two-fold: Academia does not produce useful physical results, and Industry does not utilize new theoretical ideas.
Aside from the Practical/Theoretical divergence, another closely related issue is a disagreement over the use of technology and conceptual ideas between Academia and Industry. This disagreement plays out mostly in the realm of Industry, as the actions of Academia directly affect Industry, but the actions of Industry do little to sway Academia. Specifically, Academia is more inclined to make use of older technology while creating new conceptual ideas, and Industry is more inclined to make use of newer technology while implementing older, tried-and-true concepts. To further understand this discrepancy, the difference between technology and conceptual ideas must be discussed. The technology in question includes, for example, software packages and programming languages. Industry is more inclined to use faster software and more efficient programming languages to reduce development time, because physical results need to be acquired quickly. Professors in Academia, however, are not quick to adopt these changes, because new languages arise very often, and the differences from one version to the next are not significant to the development of conceptual ideas. These conceptual ideas include efficiency, optimization, and scheduling techniques. The fundamental aspect of how a data structure or algorithm works is mostly language-independent, and can be taught effectively with archaic programming languages. The workforce in Industry is not concerned with the structure of an algorithm built into their programming language of choice; fundamental algorithms are implemented by Industry, and rarely developed therein.
This framework of Practical/Theoretical discrepancies and Technological/Conceptual subdivisions allows the problems in reconciling Academia and Industry to be fully understood, and allows solutions to be developed and proposed.

Filler.
[This section yet to be written.]

What can be done to reconcile the two nations?
Collaboration: The two nations must build a bridge, an eight-lane super-highway. The fact that it is very difficult to move from Industry to Academia will not change, so Industry must move to Academia. In higher-level undergraduate courses, more Industry-sponsored projects should be available. Training for such positions must be available as well. In recent years, this idea has become more popular. [This section needs to be expanded.]

Bringing Industry into the classroom
Many professors have moved from industry jobs into teaching. One common argument in this discussion has been that “real life examples” help to reinforce the importance of various concepts such as de-allocating allocated memory. Another argument is that students, due to their natural sense of mistrust, are often skeptical of their instructors. For this reason, many instructors find it beneficial to invite Industry professionals to reinforce classroom material with anecdotal evidence.
At Pitt, we offer a course in Software Design Methodology, CS1631. Various representatives from local industries come to the classroom and present projects to groups of 4-5 students. The students must collaborate and use a model of software development to complete the project. This course is a huge leap into the real world for a lot of students, and it is entirely beneficial for both the students and the Industry representatives. The point is that it is far easier for representatives from Industry to come to Academia to propose ideas, than it is for representatives from Academia to go to Industry to propose changes in the way they do their work.

Formal methods in practice
There have been numerous formal methods of software design developed in Academia, such as formal state-based specification languages, event-based process algebras, and formal frameworks. These methods hold much promise in theory, and have been shown to be useful and beneficial in practice, as in classrooms such as CS1631. However, it is rare to see such methods fully implemented in Industry. The software industry is not reluctant to adopt these techniques, but is held back by a lack of resources for full implementation. For these methods to be successful, they must be sufficiently scalable, and Industry must have access to specialists with an extensive background in mathematics, because the methods were rigorously defined using extensive mathematical concepts. In practice, it is rare that Business Analysts and Domain Experts are adequately knowledgeable in the formal method to participate in making or validating the formal specification.

Any relationship that is not mutually beneficial will quickly collapse
[This section yet to be written.]
 
Also, may I have your responses to the concept of Industry in the Undergrad classroom? At Pitt, we offer a course in Software Design Methodology, CS1631. Various representatives from local industries come to the classroom and present projects to groups of 4-5 students. The students must collaborate and use a model of software development to complete the project. This course is a huge leap into the real world for a lot of students, and I view it as entirely beneficial (no doubt in that).

The point I would like to make is that it is far easier for representatives from Industry to come to Academia to propose ideas, than it is for representatives from Academia to go to Industry to propose changes in the way they do their work. Do you agree with this? Somehow I expect there to be some problem with this point, but I just can't put a finger on it.
 
The point I would like to make is that it is far easier for representatives from Industry to come to Academia to propose ideas, than it is for representatives from Academia to go to Industry to propose changes in the way they do their work. Do you agree with this? Somehow I expect there to be some problem with this point, but I just can't put a finger on it.

I don't think so, but it depends on what you specifically mean.

Any academician I've talked to who has some suggestion for changing software engineering methodology is completely out of touch with reality. I walked out of a Leslie Lamport talk when he said that engineers should write mathematical models of every function they write. This kind of thinking would grind any hope of progress to a halt in any commercial application.

Most academicians also miss the problem of scale. I'll hear about some interesting idea for software development management, then ask how it scales. "Huge!", the expert will say; "we've used it successfully on teams of over 200 people!" Projects like Windows have more than a thousand developers; SQL Server has many hundreds. Are these large (and hugely successful) projects to remain neglected by the researchers?

Or do you mean just moving technology and ideas about various areas of research to production? This is very easy: engineers should be reading academic papers at all times, and be familiar with the research in their area of expertise. I don't expect that the run-of-the-mill web "developer" is doing this, but engineers who are serious about implementing usable technologies should be.

By the way, are you researching your paper, or just writing your opinion? I'm asking because I don't see any references.
 
Thank you for responding.

The problem with scalability of software design is a problem faced by academics. If I could find a scenario or two where this has actually been a true problem in industry, it would be wonderful for the paper. I'll look into that tomorrow.

I pulled the whole theory/practicality thing out of my ass about a week ago, and then went to researching the parallel academia/industry collision. From the articles I have read, most past research has been done in the field of software design methodology, and the authors usually praise the success of these designs. I am trying to not be so specific, focusing solely on software design. I am writing about the interaction between academia and industry on a fairly general level.
I'll spend a page, maybe a page and a half, discussing the pros and cons of software design. It is only one facet of the paper.
 
The problem with scalability of software design is a problem faced by academics. If I could find a scenario or two where this has actually been a true problem in industry, it would be wonderful for the paper. I'll look into that tomorrow.
I've already given you two examples. It's not hard to find others: some of the best-documented failures of extreme programming, for example.

I'll spend a page, maybe a page and a half, discussing the pros and cons of software design. It is only one facet of the paper.
What are the cons of software design? Are you suggesting that software shouldn't be designed and should just somehow happen?
 
I've already given you two examples. It's not hard to find others: some of the best-documented failures of extreme programming, for example.
I'm looking for documented examples, not anecdotes. The examples you did give are helpful, though. XP hadn't occurred to me, although I wrote a paper on it last semester. I seem to be losing my mind.

What are the cons of software design? Are you suggesting that software shouldn't be designed and should just somehow happen?
A con of software design is that it is difficult to create a method of design which is scalable. I am not saying it is impossible, just difficult. Difficulty shouldn't stand in anyone's way, though. Perhaps I should retract my statement about writing about the cons of software design.
 
I'm looking for documented examples, not anecdotes. The examples you did give are helpful, though. XP hadn't occurred to me, although I wrote a paper on it last semester. I seem to be losing my mind.


A con of software design is that it is difficult to create a method of design which is scalable. I am not saying it is impossible, just difficult. Difficulty shouldn't stand in anyone's way, though. Perhaps I should retract my statement about writing about the cons of software design.

Forget XP, look up the initial people who tried to build a large-scale application for the census calculations. It took a lot of money, a lot of spread-out teams, and a lot of other stuff that ended up uncoordinated, incompatible, and flat out disgusting. It was a huge flop. Not too long afterwards a very well-planned application was started by another company, and it went like clockwork compared to the previous developers and teams of developers.
 
Forget XP, look up the initial people who tried to build a large-scale application for the census calculations. It took a lot of money, a lot of spread-out teams, and a lot of other stuff that ended up uncoordinated, incompatible, and flat out disgusting. It was a huge flop. Not too long afterwards a very well-planned application was started by another company, and it went like clockwork compared to the previous developers and teams of developers.

I am having trouble finding any information on the example you stated, but I did find this website which is very informative. http://www.lessons-from-history.com/Level 2/Project Success or Failure.html

[EDIT] Is this what you were referring to? http://www.oag-bvg.gc.ca/domino/reports.nsf/html/20061103ce.html
 
There's one blindingly obvious example of very large projects - the open source world (specifically, the Linux kernel). It doesn't really fit into either camp, but I'd say that any comparison of software design approaches should at least give it a nod; whether or not you conclude that it's good, such distributed development and its management would certainly give you significant discussion material. It also has the benefit (to you) that it's well documented and discussed already ;)
 
I have a few issues with your paper, but not many specific suggestions just yet. I'll stick to general, constructive suggestions until I can come up with specifics.

First, (this has already been said) you make many broad statements about industry and academia without references or supporting discussion. For example:
Industry likes to use state-of-the-art technology while applying old, simple techniques. Academia uses archaic technology while developing smarter algorithms and techniques.

I'm not sure how you'd back this statement up at all. I can think of several counter examples right off the top of my head. For one, I wouldn't consider the NCSA's technology archaic. Here is another:
fundamental algorithms are implemented by Industry, and rarely developed therein

Perhaps you could tabulate the number of CS- and algorithmic-related patents from universities and companies to support this statement. I don't know how the numbers would fall out, but I seriously doubt you could consider algorithm development "rare" in industry. Here is an example of a complex algorithm developed in industry. (Yes, there's a little boasting going on - the algorithmic technique was invented by PhD's at the company I work for. Some parts were invented while they were grad and post-grad students; other parts came later. Some newer patents detail entirely new ideas.)



Second, your paper seems to equate "academia" and "freshly graduated college students". For example:
There is some conflict in that Industry must re-train those coming from Academia

I imagine a 40-year-old post-doc who has spent all his/her time in a university research lab getting hired by a private company. Later in the paper you suggest changing undergraduate courses to fix the problem you've raised. If your paper is about the disconnect between industry and college graduates, cut the word "academia" and the discussion about theoretical research. Discuss the undergraduate curriculum instead.

If you're really talking about academics, drop the undergrad discussion and find a PhD who has made the transition to industry. Interview him/her and his/her manager and co-workers.



Third, the description of the solution is contradictory. You state:
The fact that it is very difficult to move from Industry to Academia will not change, so Industry must move to Academia.

It's hard to move to academia, so industry must move to academia? Huh? And you say undergrad courses have to change to "move to academia"? That sounds like undergrad programs are changing to meet industry.



It feels like you've come to a solution before you even started researching WHAT the problem is, so you're spending a bunch of time trying to convince the reader there is a problem.

There is a disconnect between academia and industry. There is training that occurs when college grads find their first job. My best suggestion is to pick one of those two subjects and do some research. Interview some people. Find out what was difficult for a recent grad at his/her first job. Ask a software manager how they bring new people in and teach them. Talk to a post-doc who made the switch from a university.
 
While I get the gist of what you are after, I think it is important to distinguish between two forms of academics: academia in the classroom (teaching) and academia as research. Academia in the classroom should teach concepts, which can be used to understand and apply the technologies; quite frankly, if you want to work with the bleeding edge without understanding where it came from and how it works, you should be at a specialty school and *not* in a Computer Science program. Academic research is typically funded by grants whose money can be traced back to industry in one way or another. Algorithms, software designs, and concepts developed there may not be precisely what is used by the industry, but they can help shape things over time. This doesn't mean everything in academia is useful, but people are trying.

One of the PhD's in the research group I was a part of ended up working at nVidia if I recall correctly, not quite sure what he does now. Also, shameless plug: http://www.apple.com/science/profiles/hiperwall/
 
It feels like you've come to a solution before you even started researching WHAT the problem is, so you're spending a bunch of time trying to convince the reader there is a problem.

You've hit the nail on the head. I'm tired of this class and just want to be done with it. There's a page quota to meet, and I'm looking for meaningless drivel.
 
You've hit the nail on the head. I'm tired of this class and just want to be done with it. There's a page quota to meet, and I'm looking for meaningless drivel.

Perhaps what you should write about is how academics often don't give a shit. They aren't getting paid and don't tie their success in school to being successful in life. On the other hand, people in industry get paid in proportion to their value (ideally, anyway), and therefore are less likely to mail it in.

Further, you might posit that academics probably don't need an eight-lane super-highway on their bridge. They just need a whipping post, so the slackers can get a flogging.
 
You've hit the nail on the head. I'm tired of this class and just want to be done with it. There's a page quota to meet, and I'm looking for meaningless drivel.

Ah, ok. I understand. I had my share of that in school.

Instead of concluding your school's approach fixes this "problem", maybe you can investigate if these industry collaboration classes are effective. Focus on the preparedness/unpreparedness of college graduates entering the workforce. Draw a conclusion based on what you find.

It should be easy to get time to interview the people teaching these classes. It will be more work than what you've done, but I don't think it will be difficult or time consuming. Maybe you know some recent grads. If not, you could go to the alumni office to find some. Most colleges have an "employment center" - use that to find a local business willing to talk to you about hiring recent grads.

People love to talk about what they do, and people love to complain. People really love to complain about what they do. It should be easy to fill a bunch of pages with that. :p Plus, you'll have references all ready to go.
 
Further, you might posit that academics probably don't need an eight-lane super-highway on their bridge. They just need a whipping post, so the slackers can get a flogging.

LMAO

I've worked with those academics. I've also worked with ones that need a flogging so they'll stop working, go home, and have some time for themselves instead of burning out every few weeks.
 
You can PM me with questions, if you want to. I'm currently a college intern @ Huntington National Bank, so I may be able to provide some insight to what you're writing about.
 
I've worked with those academics. I've also worked with ones that need a flogging so they'll stop working, go home, and have some time for themselves instead of burning out every few weeks.
Me, too. I really enjoy helping the latter group with their papers.
 