Sinking among ifs
This is the usual motivation for exception handling. A typical example given to students looks like this:
if(open_file()) {
    if(read_file()) {
        if(process_data()) {
            show_result();
        }
        else {
            error("Failed to process data");
        }
    }
    else {
        error("Failed to read file");
    }
    close_file();
}
else {
    error("Failed to open file");
}
The calls on the happy path - open_file(), read_file(), process_data(), show_result(), close_file() - are the "good code". Everything else is there for error handling. It seems very nice to write all the "good" function calls one after another and move the error handling code somewhere else - welcome, try-catch!
try {
    open_file();
    read_file();
    process_data();
    show_result();
    close_file();
} catch(Exception e) {
    // determine and report the error message here
}
Nice, we have separated the happy path from the error handling code - now it's easy to understand what the code does! Real-life situations, however, are not so nice...
There are different types of errors
- Disasters: something that generally shouldn't happen, like a hard disk crash. Some errors are so rare and so fatal that it's pointless to try preparing for them.
- Fatal errors: stuff that renders the application unusable, e.g. losing the network connection is fatal for a web application.
- Expected mistakes: the user hasn't filled in a required field? The specified file name contains invalid characters? Such errors are predictable and applications should be ready for them.
- Glitches: a string "15 " (with a trailing space) is, in 99% of cases, just the integer 15, dammit (see the sketch below).
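To make the "glitches" case concrete, here is a minimal Java sketch (my own illustration, not from the original examples): the strict parse treats the trailing space as an error and throws, while a trimming wrapper quietly accepts the value the caller almost certainly meant.

import java.util.Objects;

public class Glitch {
    // Strict: Integer.parseInt rejects "15 " and throws NumberFormatException.
    static int parseStrict(String s) {
        return Integer.parseInt(s);
    }

    // Lenient: trim the obvious glitch away before parsing.
    static int parseLenient(String s) {
        return Integer.parseInt(Objects.requireNonNull(s).trim());
    }

    public static void main(String[] args) {
        System.out.println(parseLenient("15 "));   // prints 15
        try {
            parseStrict("15 ");                    // throws
        } catch (NumberFormatException e) {
            System.out.println("strict parse failed: " + e.getMessage());
        }
    }
}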
Opening a file is so difficult
...So, we are opening a configuration file that is not required to exist...
FILE *file = fopen(filename, "r");
Nice: NULL means the file does not exist, otherwise it's something we can read! What's the problem? You could write it the other way around:
FileStream file = null;
if(File.Exists(filename)) {
    file = new FileStream(filename, FileMode.Open);
}
Does the same thing. Does it? Congratulations, you've just introduced a full-moon bug! Files sometimes disappear, you know - they get deleted. That can happen at any point in time, for example right between the existence check and the opening... Fatal error? Crash? Or... well, that file was never required to be there in the first place? So now the code becomes:
FileStream file = null;
try {
    file = new FileStream(filename, FileMode.Open);
}
catch (Exception e) {
    // ignore
}
Wonderful: what used to be one line is now... progress.
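The same dance looks no better in Java. A minimal sketch, assuming an optional configuration file whose absence is perfectly fine:

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;

public class OptionalConfig {
    // Returns the config stream, or null if the file simply isn't there.
    static InputStream openIfPresent(String filename) {
        try {
            return new FileInputStream(filename);
        } catch (FileNotFoundException e) {
            // The file is optional, so "not found" is not an error at all.
            return null;
        }
    }
}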
How badly can you blow up?
C once again. You call a function and you expect it to return. Is this guaranteed? No! The application might die inside, but we don't care. longjmp() can be called, but again we don't care - unless we set it up ourselves. Let's "upgrade" to C++. What can happen now? Yes, an exception can be thrown, and there are many types of them! Worse: new exception types can be added in the future!
It's considered good practice to only catch the exceptions you actually care about and let the others propagate up the call stack. That's fine, but what about the new exception types that might be added in the future? It looks like someone didn't design for the future...
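A minimal Java sketch of that advice (the config-loading names are hypothetical): only the one failure we expect is caught locally; everything else, including exception types that don't exist yet, propagates to the caller.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class ConfigLoader {
    // Hypothetical loader: a missing file is expected and handled here,
    // every other failure propagates up the call stack.
    static String load(Path path) throws IOException {
        try {
            return Files.readString(path);
        } catch (NoSuchFileException e) {
            return "";  // missing config is fine, fall back to defaults
        }
        // AccessDeniedException, disk errors, and whatever new IOException
        // subtypes appear in the future are not handled here on purpose.
    }
}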
Exception safety
There is an amazing thing about exception safety that I still can't explain. C++ is a language whose standard library throws extremely rarely, yet "exception safety" is a standard topic in its books. When we come to Java and co., where exceptions are thrown here, there and everywhere, the topic is somehow forgotten...
obj.foo(x, y);
You can only guess how foo works with x and y, but there is one thing most seem to assume - all or nothing. If an exception is thrown out of foo(), you want the state of obj left unchanged! A simple concept, but not so easy to get right.
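A minimal Java sketch of how it goes wrong (the Account class and its fields are my own illustration): the naive version mutates state before the step that can throw, so a failure leaves the object half-updated; the safer version does everything that can fail first and commits last.

public class Account {
    private long balance;
    private int transactions;

    // Broken: if the withdrawal is rejected, 'transactions' was already bumped.
    void withdrawNaive(long amount) {
        transactions++;
        if (amount > balance) {
            throw new IllegalArgumentException("insufficient funds");
        }
        balance -= amount;
    }

    // All-or-nothing: validate first, mutate state only when nothing can throw anymore.
    void withdraw(long amount) {
        if (amount > balance) {
            throw new IllegalArgumentException("insufficient funds");
        }
        balance -= amount;
        transactions++;
    }
}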
Throw more exceptions and enjoy more full-moon bugs.
Exception specifications
This is something that pissed me off when I started learning Java. C++ has them too, but they are optional and nobody seems to use them (except for the standard library). Some even discourage them. Looking at C#, they have thrown specifications away entirely.
Looking back at Java... ArrayIndexOutOfBoundsException, PersistenceException and many others are "unchecked" exceptions, so you don't need to write them all over the place. Are the two I just mentioned really so "unexpected"?
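A minimal Java sketch of the asymmetry (findUser and its behaviour are hypothetical): the checked IOException must be declared here and re-declared by every caller that doesn't handle it, while the equally possible ArrayIndexOutOfBoundsException leaves no trace in the signature.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class Users {
    // The checked exception must appear in this signature and in the
    // signature of every caller that doesn't catch it itself.
    static String findUser(Path file, int index) throws IOException {
        String[] lines = Files.readString(file).split("\n");
        // This can throw ArrayIndexOutOfBoundsException, yet nothing
        // in the signature warns the caller about it.
        return lines[index];
    }
}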
Conclusions
- Exception handling works well for critical errors. The less serious the error, the less effective exception handling is. For simple errors, exceptions are more trouble than they are worth.
- Exceptions are designed to separate useful code from error handling code. When exception handling mechanisms start appearing inside nested code blocks, it's the first sign of exception misuse.
- I also haven't mentioned that exceptions are expensive in terms of performance...
Comments
Many of the issues aren't very well-argued. Some of the arguments are just out-of-date.
The lack of an atomic OpenOrCreate function is a library/OS deficiency, and creates exactly the same kind of problems regardless of whether exceptions are used.
Exception handling does **not** usually impose performance overhead anymore (on all 64bit compilers) unless exceptions do occur. If that's so frequent that it becomes a performance bottleneck, then yes, by all means redesign the code to not rely on exceptions for the common scenario.
The fact that the standard library rarely throws is a conscious feature. Some embedded/real-time platforms do not afford the luxury of exceptions, and the larger part of the standard library can still be used on these platforms.