I have a degree in mathematics and computer science, but I still don't know why different programming languages evaluate integer/integer
differently. On any calculator, dividing one integer by another gives a decimal number (unless, of course, the denominator evenly divides the numerator).
Python, for example, returns an integer:
>>> print 3/10
0
>>> print 3/10.0
0.3
Perl, on the other hand, returns a decimal number:
print 3/10; print "\n";
print 3/10.0;
...gives...
0.3
0.3
PostgreSQL is like Python:
database=# SELECT 3/10 AS quotient1, 3/10.0 As quotient2;
 quotient1 |       quotient2
-----------+------------------------
         0 | 0.30000000000000000000
And good old MS Excel gives:
=3/10 <=> 0.3
Why is this so?