I think a better alternative would be "Don't Make Students Use Java". I seriously can't think of a worse language for training beginner/intermediate programmers, and I certainly wish my schooling experience had consisted of something else. From day one you're introduced to magic on top of magic: "Open up <IDE> and click 'new project' and name it 'MyProj', now click on the file in the side menu that says 'MyProj', and inside `public static void main(String args[]){}` inside the `public class MyProj{}` write `System.out.println("Hello World")`, then go to the top bar and hit 'Run' and 'Hello World' will appear in the integrated console. Ah, programming!"
And you don't even begin to grasp what all those steps you just took in the IDE did until you're at least a year in, and you don't learn what all of those magic words even remotely mean until at least halfway into your second semester.
I felt like every single thing I ever did was magic, never having a clue why it ever actually worked the way it did, and as a result nearly dropped out. The only time things _finally_ started to make sense is when I got to Computer Architecture/Organization with MIPS followed by some work in C.
I still feel like C# is a better Java. It has most of Java's features plus others, it implemented important features before Java did, and it doesn't reach Java's level of verbosity and bloat.
I used to feel that way a few years ago, but my opinion has shifted over time. C# has its own issues with verbosity and bloat. For example, although I appreciate the readability increase from overloaded comparison operators, it's a lot of boilerplate to write a class that implements IEquatable<T> and IComparable<T> and overloads all of the related operators.
Here I disagree. I would not start teaching programming in a manual memory language (though that is how I started) because memory corruption is really hard to deal with and completely inconsequential to the basics of programming.
I do think it's important that every programmer should know how to work with raw memory and pointers, but it is a good subject for an intermediate class, after you have the basics of what you are supposed to do while programming.
For a good "what programming really is" class, mobile app development and some web dev would be great. First a mobile hello world, preferably in Kotlin. Then they can whip up a very basic backend (it just spews out rows from a DB), again in Kotlin. And then they can slap an admin UI on top in HTML.
And then over the next few years it would be great to dissect this. CPU, ALU, ASM, raw memory, network basics, IP (BGP ~ distance vector routing), TCP, buffers, syscalls. RDBMS, MVCC, B-trees, merge sort, heap sort, and so on.
Oh well, one can dream. But usually students are just bombarded with completely disconnected courses.
I was thinking of higher education. I think it's far more useful to teach algorithm design first and the nitty gritty of computer implementation later. This is, after all, how the field actually evolved. We had algorithms and programs far before we had the first computer.
Java is great for exactly that reason. It gives students an introduction to compilation, runtimes, stacks and heaps etc. Perhaps you had a poor learning experience due to who/what was teaching you rather than Java? My second comp sci course in college (first if you count high school AP credits), taught me all of this, and prepared me for the "full details" of MIPS and C.
My main complaint about Java as a pedagogical language is that it's just so big, with so many complexities around semantics, compilation, runtimes, etc. So my fear is you end up spending as much time learning Java as you do learning how to build software. I wouldn't choose C# or C++, either. My sense, from working with others who came out of CS programs that relied on Java, is that they end up only really knowing how to work in Java, and therefore only really knowing their way around the ways of solving problems that come most naturally in Java.
It could be my chauvinism speaking. I'll admit to having some. My CS program taught us three relatively small programming languages (Scheme first, followed by C, and then MIPS assembly) in the first year before introducing OOP in the second year. So we ended up getting exposure to a variety of programming paradigms and environments, and came away with a lot of tools in our belt, and a fairly deep understanding of programming language theory and paradigms. And I did eventually find myself settled in a Java shop. When I did, I found that within only a couple months I had already developed a deeper understanding of the language than colleagues who had spent their entire careers working in it. And I believe a lot of that is because I had enough background knowledge to distinguish between, "This is how programming is done," and, "This is how programming is done in Java."
Pass by reference / pass by value is so syntactically batshit insane in Java that it shouldn't even be considered as an intro language option.
And before someone points out that, actually-technically, Java is always pass by value, I will pre-retort that reference-as-value isn't an argument we should ever be having about an intro programming language.
Either teach a systems language with unambiguous pointers, or teach something that abstracts consistently.
The only problem here is using the term "reference" for reference variables. If you would say "these are pointers to objects, but you can't actually do any pointer arithmetic", this would be a non-issue. The pass by value is not the problem here.
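That model is easy to demonstrate. Here's a minimal sketch (class and method names are made up for illustration): the reference itself is copied on the call, so reassigning the parameter changes nothing for the caller, while mutating through it is visible:

```java
// Sketch: Java always passes the reference *by value*.
class PassDemo {
    static void reassign(StringBuilder s) {
        s = new StringBuilder("reassigned"); // rebinds the local copy only
    }

    static void mutate(StringBuilder s) {
        s.append(" mutated"); // follows the reference; the caller sees this
    }

    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder("original");
        reassign(sb);
        System.out.println(sb); // prints "original"
        mutate(sb);
        System.out.println(sb); // prints "original mutated"
    }
}
```

Framed as "a copy of a pointer to the object, with no pointer arithmetic", neither result is surprising.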
C and Pascal and even C++ have a very simple model. The address of a variable is a pointer. You can actually understand how memory is laid out on hardware and why you would want to pass references or pointers instead of copying very large data structures.
Learning C first would also introduce students to all of that as well, outside of the magic shell of Java and its tooling; get your compiler, get your text editor, and you are so much closer to the action. If I could have chosen what I started with I would likely have picked C, or possibly a LISP as well.
I went through Basic -> Pascal -> C -> C++ -> many other kinds of languages.
Thanks to C and C++ I came to understand the most important fundamentals of programming: how a program runs, how hardware works, and the most important algorithms and data structures.
I think it's important to learn something that doesn't do too much abstraction but isn't as low-level as assembly language.
The point is, it's a terrible first language because it's impossible to actually understand what's going on even in a basic "hello world" program until halfway through the semester. So inevitably you have to start them off by saying "here's a bunch of noise that you don't understand, but that's OK, just copy and paste it" which is a terrible habit to reinforce.
You need to understand
* access modifiers (public/private)
* classes
* static methods
* void return values
* the main() function
* modules (System.out.println)
Just to understand a basic "hello world". It's too much for first-time programmers.
Actually, the hello world doesn't seem that bad to me. And I program mostly in Python nowadays.
class Test {
    public static void main(String[] args) {
        System.out.println("Hello world");
    }
}
And then
javac Test.java
java Test
The class acts as a kind of namespace here; you don't need to go into OOP concepts for hello world at all. The beginner will have to understand
* Class is in file with the same name ¯\_(ツ)_/¯ and contains functions (maybe it'll help me later in organizing code?).
* Command line arguments (not strictly necessary concept but not too problematic either)
* Types - why is there String[] in the definition? (A must have in a statically typed language anyway)
* void - the function doesn't return anything (OK)
* public (not OK, some magic here)
* static (not OK, some magic here)
* Compilation vs. run step (a must have in a compiled language)
So IMHO there are just two concepts that could be thrown away. Or three, if you accessed CLI args from a library. But the meaning of public and static will come naturally when the students learn about OOP.
I think you can omit the "public"s in the Java hello world (thereby making everything package protected, and sneakily sidestep the issue), but I don't want to ruin this machine by installing Java on it to make sure.
To be super thorough, you would also have to understand:
- semicolons (and the related issue of newlines being semantically equivalent to normal spaces)
- naming conventions (to explain why it's not "System.Out.Println" and "Main")
You can in C#, but not in Java. You can make it an enum, I think, to shorten all that. But my Java golfing knowledge is a bit rusty by now, admittedly.
You are right about first time programmers. But should people come to university without any prior programming knowledge?
If someone applying for an architecture or engineering degree is required to have basic drawing skills and basic math knowledge, why shouldn't we assume the same for CS?
Because computer science isn't about programming? That's exactly why a simple and elegant language should be chosen, one that allows you to express the concepts being taught. Java is such a horrible choice that it's laughable that it enjoyed this success at universities.
Yes, a simple language is better to teach CS. CS is not just about programming, but I think CS is about programming too and having a reasonable level of programming is useful.
If I was a teacher and I'm going to teach someone what a linked list is, or what a binary tree is, I can of course use natural language and drawing, I can use abstract algebraic concepts.
But the student should be also able to construct said structures and observe how they work. The same for any other concept.
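For instance, a linked list small enough for a student to build and trace by hand might look like this (a hypothetical sketch, not from any particular course):

```java
// A minimal singly linked list a student can construct link by link.
class LinkedListDemo {
    static class Node {
        int value;
        Node next; // null marks the end of the chain

        Node(int value) { this.value = value; }
    }

    public static void main(String[] args) {
        // Build 1 -> 2 -> 3 by hand so every link is visible.
        Node head = new Node(1);
        head.next = new Node(2);
        head.next.next = new Node(3);

        // Walk the chain, observing how the structure actually works.
        for (Node n = head; n != null; n = n.next) {
            System.out.println(n.value);
        }
    }
}
```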
Yes, and that's why a simple language is always preferable to Java. It doesn't matter if a student has previous exposure to programming languages if this language can be taught in about two or three weeks worth of class.
Seems like your complaint is against using IDEs early on and not Java. It's been a while since my undergrad, but we used Java and a text editor. We compiled with javac and ran the jars from the CLI.
Likewise. It hasn't been that long since I finished my undergrad (<5 years), and if I remember correctly, we didn't start using Eclipse until second year, and even then it was considered 'optional', i.e. as long as your code compiles and passes the test cases, it doesn't matter whether you use Eclipse, vim, Notepad, etc.
The upshot of this was that I didn't have a clue what Maven was until I started working, but you can learn tooling on the job.
It wasn't until the final year of my 4 year Software Engineering degree I used an IDE for the first time (Eclipse, actually) and I was totally disoriented, not understanding how all the magic was happening.
I consider the IDE's magic harmful, don't get me wrong, and perhaps you were taught better than I was, but you still have classes and related incantations out of the gate. (I would prefer a non-imperative language as an ideal starter, but I'd settle for a procedural one over an OO one.)
For very beginners, yes: Java is not the right programming language to teach. (I'd say Python is.)
But for intermediate/advanced classes I think Java is a must learn for every CS graduate.
1. While it is bloated, it has so many modern and not-so-modern concepts built into it that learning it makes you understand those concepts better. (E.g., it is hard to understand generics if your favorite language doesn't have them.)
2. A lot of enterprise (and major tech) backends still run on Java. I think knowing it is a must to be competitive in the job market as a fresh grad.
3. Learning Java (and probably disliking it) will make you appreciate, and perhaps make you better at learning, other more nimble languages (e.g. Go and friends).
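On point 1, here's a tiny sketch of the kind of generics concept that's hard to grasp from a language that lacks them (the Pair class is a made-up example):

```java
// A generic Pair: written once, parameterized over any two types.
class GenericsDemo {
    static class Pair<A, B> {
        final A first;
        final B second;

        Pair(A first, B second) {
            this.first = first;
            this.second = second;
        }

        // The swap logic is written once, generically,
        // and the compiler checks every use of it.
        Pair<B, A> swapped() {
            return new Pair<>(second, first);
        }
    }

    public static void main(String[] args) {
        Pair<String, Integer> p = new Pair<>("answer", 42);
        Pair<Integer, String> q = p.swapped();
        System.out.println(q.first + " " + q.second);
    }
}
```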
CS programs aren't there to let you make some easy money in the next few years. CS programs are there to provide a thorough, high-level understanding of computing, of which writing code is just a small part. Having a CS degree is much more than learning a language and a framework.
If one's goal is just to earn some money in the next few years, they would be better served by taking a Udemy course.
University didn't teach me how to code, I did that myself before going to university. What it did teach was the why's and how's of computing without which I would be just a code monkey churning code without having a better understanding.
It is the same difference as between someone who takes a course on repairing electrical appliances and electronics and the engineer who designs them.
I'd like to stress the second point. When I graduated in 2015 and did multiple interviews with enterprises, Java or .NET was critical. I don't know if it's still the same, but looking at job postings, I'd say it is.
It's definitely different if you work for an IT company, but not always.
It’s kind of embarrassing, but I didn’t understand generics from my beginner CS course (taught in Java). It wasn’t actively tested on, either. I only began to appreciate them when I took an online MOOC in OCaml on a whim.
Type erasure gets an unfairly bad rap. Yes, it makes reflection difficult sometimes. Yes, it doesn't play well with value types. But other than that, erasure is a very powerful concept.
My understanding is that the only value in type erasure is that it maintains compatibility with libraries compiled in ancient versions of Java. Are there language-level benefits to type erasure?
Absolutely. Erasure lets you write code that's actually generic, instead of code that appears to be generic.
For example, in C# IList<Foo> and IList<Bar> are two different interfaces that happen to have similar methods. Whereas in Java, List<Foo> and List<Bar> are the same interface. This means you can do things like this in Java:
List<?> list = ...
Object o = list.get(0);
To do something similar in C# is significantly more effort. You either have to duplicate all of your interfaces (IList + IList<T>), use `dynamic`, or use reflection to compile delegates at runtime.
A common complaint is "but erasure lets you add an integer to a list of strings". But as long as you follow PECS[1] rules you can avoid most of those situations.
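A sketch of what following PECS ("Producer Extends, Consumer Super") looks like in practice (the method and class names here are mine, not from any library):

```java
import java.util.ArrayList;
import java.util.List;

class PecsDemo {
    // src only *produces* values for us  -> ? extends Number.
    // dst only *consumes* those values   -> ? super Number.
    static void copy(List<? extends Number> src, List<? super Number> dst) {
        for (Number n : src) {
            dst.add(n);
        }
        // src.add(1) would not compile here, and dst.get(0) would only
        // give us Object - exactly the safety the bounds are buying.
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3);
        List<Object> sink = new ArrayList<>();
        copy(ints, sink);
        System.out.println(sink);
    }
}
```

The bounds make "adding an integer to a list of strings" a compile error at the call site rather than a runtime surprise.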
I don't think either of your points has anything to do with type erasure, and everything to do with allowing generics to take value types. This was an easy decision in Java, since it doesn't (yet?) allow user-defined value types. But C# has had `struct` since the beginning.
C# creates one instance of a generic for all reference types. The general consideration in instantiating multiple versions is object size. All reference types are the same size, so not a concern. Value types, however, range wildly in size and so generally get their own specialized versions during code generation.
The same choice affects the ability to wildcard. C# probably could implement wildcards over reference types, but it would feel inconsistent without value types. And, honestly, a good portion of wildcarding in my experience is to handwave away the compiler when you know what you're doing without reified types. Simply not an issue in C#: its stronger guarantees around generic types mean I can make the code generic over the type that, in Java, I would be wildcarding.
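The Java side of that trade-off can be sketched like this (a hypothetical example): a wildcarded signature can't connect a value read out of the list with the list it goes back into, while a named type parameter can:

```java
import java.util.ArrayList;
import java.util.List;

class GenericOverWildcard {
    // Wildcard version would not compile: the compiler cannot prove that
    // the element removed from List<?> is the right type to add back.
    //
    //     static void rotate(List<?> xs) { xs.add(xs.remove(0)); } // error
    //
    // Naming the type parameter links the get and the add together.
    static <T> void rotate(List<T> xs) {
        xs.add(xs.remove(0)); // move the head element to the tail
    }

    public static void main(String[] args) {
        List<String> xs = new ArrayList<>(List.of("a", "b", "c"));
        rotate(xs);
        System.out.println(xs);
    }
}
```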
Code generation is definitely important to talk about, but that really wasn't the focus of my first example. Even if Foo and Bar were both reference types, the same reasoning would apply.
In C#, you can do:
class MyClass : IList<Foo>, IList<Bar> { ... }
In Java, you can't do:
class MyClass implements List<Foo>, List<Bar> { ... }
I know this is commonly viewed as an annoying restriction, but, IMO, it's rather an indication that you're writing code that doesn't respect the contract of the generic interface. For example, what should the `Count` property return if you're implementing IList<T> twice? (C# wiggles around this with explicit interface implementations, but I think it's fair to argue that that's not a strictly superior approach to Java's).
I really want to know what other languages large enterprises are using that have, or are on the track to getting, the kind of volume Java has. Especially languages that don't run on the JVM.
Here's an anecdote: Java almost made me drop out. My university was teaching Java and it was an absolutely awful experience. With my web 1.0 HTML/CSS and Turbo Pascal experience, I was blown away by how awful "professional" programming was.
Then I found this new kid on the block, Python, and it was a completely different story. I fell in love with programming and hacking in general, and finally, at the end of my studies, I knew enough Python to bootstrap myself with Java, which I still think is an abomination.
When I became a student in 1998 we used C and C++ for CS courses. C is a simple enough language, and low-level enough to understand what your code does when running on the hardware.
Fast forward to 2 years ago: I did an MSc at the same university and most courses were done using Java. I guess that's due to the long arm of Silicon Valley.
Although Java was used for the teaching material, the teachers allowed assignments in the language of your choice if the problem permitted. I did most of the assignments in C#. One I did in Java because I needed a powerful search tool and the C# version of Apache Lucene was old. Other people used Java for assignments; one guy was using Python.
I have nothing against Java or using Java for teaching. I think it's better to use Java than Python since it has static typing, enables parallel computing, and, being a compiled language, is faster.
However, I dread Java and I avoid it as much as I can. It feels verbose, there's too much boilerplate code, and it lacks some features.
C# feels a lot cleaner. But even though it has many libraries and enables you to target anything and do all kinds of programming, it lacks the massive ecosystem of Java.
If someone forced me to choose between C++ and Java, I'd happily choose C++ if the task permits (i.e. I won't use C++ for web). And that says a lot, since I am not very fond of solving the kind of bugs C++ almost guarantees you will have once the code is large enough. I'd rather solve C++ bugs than deal with the boilerplate and bloat and verbosity of Java.
That being said, I don't blame Java or the people who designed Java and its libraries. Almost all old systems have their share of problems, and Java is kind of old now.
Newer programming languages and their ecosystems are solving some of Java's problems, but if they still exist in 20 years, we'll see that they've brought their own baggage of problems.
I guess a perfect programming language is like a unicorn: we strive to catch it even though we know it doesn't exist.
What matters to me most these days is productivity: the speed at which something decent can be brought to market. By decent I mean something somewhat testable, somehow easy to understand in the future, somehow extendable, and somehow maintainable.
There are always going to be trade-offs; making the right trade-offs based on the particular problem and resources is an art.
Not to mention those classes would make good programmer candidates start to hate programming.
I can't imagine why people start teaching teens with C and Java and make them think programming is boring and annoying. Absolutely worse than not taking the class at all.
What do you ask teens to accomplish with C? And all the memory management and pointers? It's the worst way to enter programming. It needs to be taught with something that's actually fun to do.
My undergrad was a while ago, but we also did assembly and Prolog first. The lecturer basically said assembler was there almost to shock us into seeing how preferable the magic was. I still don't understand why Prolog.
I struggled with the Java, but got my teeth into it on my own time to the detriment of my marks in other modules. Eventually it clicked enough, I got a degree and walked into a career where my skills have been in demand for 15 years. I'm pretty happy with that, I'd go so far as to say Java was by far the best part of the course for me.
I was taught through Java, but it was all command-line and javac.
I did once sort out a much better programmer's final-year project for him, though (years later, when I worked with him), because he had a CLASSPATH issue. That did make me realise there are people taught the way you describe who miss some of the basics.
I teach Java to first-semester students and I tried a couple of times to switch to another language because I think it’s not a good language to start with and because the students don’t like it either. However, the business (and we are a private university driven by businesses sending students) builds on two primary technologies: Java and COBOL. So even if I did teach another language, I would fail to prepare them for their careers in engineering.
What I do instead, though, is talk about the architecture and background, show different approaches, and let them make the mistakes that help them learn to solve problems.
I disagree with this. As much as I dislike Java, I believe it's a great language for teaching someone object-oriented programming. And you don't really need an IDE at all; any text editor will do, and it will probably take away the complexity of learning the UI and help you concentrate on what actual programming is. An IDE is really not needed to write your first algorithms as a student.