Pangloss Posted April 29, 2006

As a CS educator, I'm growing increasingly uncomfortable with the fact that we seem to be moving farther and farther away from traditional programming goals and aiming students solely at client-server, database-access work.

Part of the problem, of course, is that that's what most programmers do these days. The vast majority of programming tasks in the business world (which is to say most programming) involve a client application and a database server. There are exceptions, of course, but it's gotten to the point where that's almost an underlying premise of every single project under development in most companies. People don't even talk about it anymore; it's just... assumed.

Not that I'm complaining about business programming -- it's astounding how far it has advanced, just in the last few years. How many ads have you seen lately for SharePoint portal administrators? It's a two-bit hack job of a portal app, but it gets the job done, it's ubiquitous, and it's the sort of thing the MBA types get weak in the knees just thinking about. Business services are absolutely energizing the corporate world right now.

But what are we losing in the process? The most powerful IDEs today are based on namespace extrapolation and abstraction -- System.category.function. This is great if you happen to be writing an application that will execute in an operating system that lives in that namespace. But what if you're not? You might as well be writing in Notepad. Where are the killer new tools for writing system software? Where is Visual Studio .GAME?

Most traditional colleges and universities with CS programs are suffering these days from lack of enrollment. Everyone's going into the applied side of computer programming. You guessed it -- working in managed code. One of the things that's interesting to me about that is that "Computer Science" shouldn't actually even be about programming -- it should be about science. CS majors should be our foundation for the inventions that give us a future. Instead it's become the last resort for producing unmanaged code gurus.

But where else are the unmanaged code gurus going to come from? They're not going to come from the burgeoning "CIS" programs -- those are all producing managed code experts. And they're certainly not going to come from the trade schools or the business schools. And yet unmanaged code is still critically important to society. We don't really want to have to install Windows on every elevator just so we can run the program we need to make it go to the right floors, do we?

What do you all think?
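For illustration, here is a minimal C# sketch (not from the original post) of the kind of namespace-driven, managed client/database code described above; the connection string, table name, and query are hypothetical placeholders.

    // A minimal sketch of namespace-driven managed code: every call lives
    // under System.*, and the IDE can auto-complete each step.
    // The connection string and query are illustrative placeholders.
    using System;
    using System.Data.SqlClient;

    class CustomerReport
    {
        static void Main()
        {
            SqlConnection conn = new SqlConnection(
                "Server=localhost;Database=Sales;Integrated Security=true");
            SqlCommand cmd = new SqlCommand("SELECT Name FROM Customers", conn);

            conn.Open();
            SqlDataReader reader = cmd.ExecuteReader();
            while (reader.Read())
                Console.WriteLine(reader.GetString(0));   // one row per customer
            reader.Close();
            conn.Close();
        }
    }

None of that namespace plumbing exists outside the .NET runtime, which is the gap the post is pointing at when it asks about elevators and system software.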
DV8 2XL Posted April 29, 2006

Replace "programmers" with mechanics, or any number of skilled tradesmen being replaced by "standard processes and procedures", and join the club. The idea that skilled people, trained and then developed under the watchful eye of an experienced practitioner for several years, can be replaced by any random sod working from a three-ring binder is not endemic to programming alone.
Pangloss Posted April 29, 2006 (Author)

That's an interesting point. As the tools get easier to use, "programming" becomes more of a routinely trainable skill, and "programmers" become a more routine commodity. Specialization would seem to become more important. We already have a situation where "programmer" by itself is not a sufficient description -- you need to know what kinds of programs the person has written. Maybe we need a new word, separate from "programmer", to distinguish those who understand the theoretical underpinnings of programming from those who work in the simple client-server, managed-code, increasingly wizard-based environments that people are routinely "programming" in today.
the tree Posted April 29, 2006

Developer? Person-who-appreciates-the-brilliance-that-is-Notepad? (Pwat-B-Tin for short)
DV8 2XL Posted April 29, 2006

Specialization of this nature is for insects. We're heading down the merry road to hell with this attitude. Skills, once lost by a field, are damnably hard to replace, as much of the craft technique is empirical and rarely recorded.

This is part of a broader issue that began when the apprenticeship system (which had only worked for the past 1000 years) was dumped in favor of trade schools. While it is true that some trades have gotten so complex that a good grounding in the classroom is needed, there is no way that this can replace shop-floor training. The problem has been compounded by a number of other factors as well, in particular a general belief that the trades are a dumping ground for those who can't cut it academically. The trouble is that, due to the rising complexity I alluded to above, marginally literate and marginally numerate people are poor candidates for the skilled trades anymore. Unless we find and train new skilled people before the Boomers (the last group to get a traditional apprenticeship) retire, we are going to be in a world of hurt.
drochaid Posted May 2, 2006

Speaking as someone with a degree in Computer Studies (note, not Science -- more analysis/business oriented) and managing director of a small IT services company in the UK, I have mixed feelings about what people should be taught in school/college/university.

I find it utterly pointless for courses to even attempt to keep up with whatever the current fad in the computing world is; it changes faster than a course can be written. What I would be looking for perhaps comes close to your own thoughts -- you can let me know if I seem to have got your emphasis wrong. I believe people should be taught how to program -- or rather, how to become software developers -- or how to do whatever their particular field of study is. Once they have that level of knowledge, picking up the particular method and language required should be a simple matter of a language reference and a little time to familiarise themselves.

I would much rather hire someone from uni on the basis of general skills that I can develop than very specific skills that may be useful to me for only a limited time. It just doesn't make business sense.
Cthulhu Posted May 2, 2006

> As a CS educator, I'm growing increasingly uncomfortable with the fact that we seem to be moving farther and farther away from traditional programming goals and aiming students solely at client-server, database-access work. Part of the problem, of course, is that that's what most programmers do these days. The vast majority of programming tasks in the business world (which is to say most programming) involves a client application and a database server. There are exceptions, of course, but it's gotten to the point where that's almost an underlying premise of every single project under development in most companies. People don't even talk about it anymore, it's just... assumed.

The reason is that businesses have always really needed software engineers and not computer scientists. Computer scientists have generally left university and found that the theoretical concepts they have learnt are not what most companies are after. Many companies just need existing products strung together. Now software engineering and computer science are really diverging, and I agree with you that this is largely because the business-specific tools are getting more advanced and higher level and don't require a good knowledge of computer science to use.

> But what are we losing in the process? The most powerful IDEs today are based on namespace extrapolation and abstraction -- System.category.function. This is great if you happen to be writing an application that will execute in an operating system that happens to live in that namespace. But what if you're not?

Exactly. The specificity of .NET is the thing that annoys me most about it. The other thing is how Microsoft butchered C++.

> Most traditional colleges and universities that have CS programs are suffering these days from lack of enrollment. Everyone's going into the applied side of computer programming. You guessed it -- working in managed code.

This is understandable. Unfortunately, businesses are after general knowledge of concepts like XML, common databases and encryption, and especially their application to business. Businesses are generally not after knowledge of more theoretical and academic topics like the different fields of AI, compiler theory, the real details of how databases work, and so on. There are far more jobs on the software engineering side of things, and as computer science and software engineering diverge I can imagine more students going for software engineering if it offers better job prospects. Not that computer science will go extinct -- there will always be a demand there, just not quite as much.

> One of the things that's interesting about that to me is the fact that "Computer Science" should actually not even be about programming -- it should be about science. CS majors should be our foundation for the inventions that give us a future. Instead it's become the last resort for producing unmanaged code gurus.

I agree that teaching a specific programming language like C++ or Java shouldn't be part of a computer science course, but teaching generic programming concepts like OOP should. If I had my way I would make good knowledge of programming a prerequisite of CS courses, rather than having the courses teach it from scratch as they currently do.

> But where else are the unmanaged code gurus going to come from? They're not going to come from the burgeoning "CIS" programs -- those are all producing managed code experts. And they're certainly not going to come from the trade schools or the business schools.

I don't know what percentage you would put it at, but I would say probably only about 10% of computer science graduates could get a job in programming and keep it. Most were simply not interested enough in programming to use it in their spare time, which in my opinion is the key to becoming proficient at it.

> And yet unmanaged code is still critically important to society. We don't really want to have to install Windows on every elevator just so we can run the program we need to make it go to the right floors, do we?

Perhaps you are using the phrase "managed code" differently from how I am accustomed to. I use it in the .NET sense, where managed code is safe because it runs on a virtual machine with a garbage collector. Unmanaged Win32 code can still be written, but Microsoft and industry are moving away from it (which is sad, in my opinion). Then again, the embedded device industry is growing and is more reliant on unmanaged code (when they aren't using Java).
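As a rough C# sketch of the distinction being drawn here (not from the thread; the buffer sizes and names are arbitrary): the first buffer belongs to the garbage collector, while the second is raw unmanaged memory that the runtime neither tracks nor frees.

    // Managed vs. unmanaged memory, side by side, in one managed program.
    using System;
    using System.Runtime.InteropServices;

    class ManagedVsUnmanaged
    {
        static void Main()
        {
            // Managed: the CLR's garbage collector owns and reclaims this array.
            byte[] managedBuffer = new byte[1024];
            Console.WriteLine("Managed buffer length: " + managedBuffer.Length);

            // Unmanaged: raw memory from the process heap; the runtime does
            // not track it, so it must be freed explicitly.
            IntPtr unmanagedBuffer = Marshal.AllocHGlobal(1024);
            try
            {
                Marshal.WriteByte(unmanagedBuffer, 0, 0xFF);
                Console.WriteLine("First unmanaged byte: " + Marshal.ReadByte(unmanagedBuffer, 0));
            }
            finally
            {
                Marshal.FreeHGlobal(unmanagedBuffer); // skip this and the block leaks
            }
        }
    }

Forgetting the FreeHGlobal call leaks the block for the life of the process, which is exactly the class of bug the garbage collector exists to remove.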
RyanJ Posted May 3, 2006

I agree with the OP; it's bad in some ways, in that it can be said the programmers are not learning "real" code -- they are essentially letting the program do all the work for them. Then again, it does encourage more people to program, and it is surprising how many people go on to learn the C languages after starting with something like VB.NET.

MS is now strongly pushing the use of .NET-based languages such as C#, VB.NET, J#, etc. Luckily there are people working to port this to Linux and Mac, so one day there may be complete cross-platform compatibility.

Cheers,
Ryan Jones