Big Endian & Little Endian, What is the Difference?

In this post I will try to demonstrate the difference between Big Endian and Little Endian, the two flavors of so-called Endianness!

We know that everything in the computer's memory is represented as a series of bits. For example, I can have the sequence of bits 0100 0001, which has the decimal value 65 (41 in hexadecimal).
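As a quick sanity check, here is a small sketch using the JDK's Integer.parseInt with radix 2 (the class name is mine, not from the post):

public class BitValueDemo {
    public static void main(String[] args) {
        // Parse the bit pattern 0100 0001 as a base-2 number
        int value = Integer.parseInt("01000001", 2);
        System.out.println(value);                      // 65
        System.out.println(Integer.toHexString(value)); // 41
    }
}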

In Java, the primitive type int is represented by 4 bytes = 32 bits, and Java interprets this 4 byte value as a signed integer. So an int is stored in memory something like this:
01000001 01000001 01000001 01001110
This has the decimal value 1094795598. (Whether it is signed or not is not a concern right now, since the first bit is 0.)

Well, we can say that the first bit is 0 because we used Big Endian implicitly. There are 4 bytes in the given sequence, and because we are used to reading from left to right, we assumed the first byte is the most significant byte of the value.

Once you have more than one byte representing a numeric value, the order of the bytes becomes crucial.
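One way to make this concrete is java.nio.ByteBuffer, which lets you pick the byte order explicitly when interpreting the same four bytes (a sketch; the class name is mine):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        byte[] bytes = {0x41, 0x41, 0x41, 0x4E}; // the four bytes from above

        // Same bytes, two different interpretations
        int bigEndian = ByteBuffer.wrap(bytes).order(ByteOrder.BIG_ENDIAN).getInt();
        int littleEndian = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getInt();

        System.out.println(bigEndian);    // 1094795598
        System.out.println(littleEndian); // 1312899393
    }
}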

To demonstrate this I will use the following code written in Java:
import java.io.*;

public class Test {
    public static void main(String[] args) throws Exception {
        File file = new File("/users/koraytugay/desktop/koray3.txt");
        int i = 1094795598;
        DataOutputStream outputStream = new DataOutputStream(new FileOutputStream(file));
        outputStream.writeInt(i); // DataOutputStream writes ints in big-endian byte order
        outputStream.close();
    }
}

Now I will open this file with my favorite Hex Viewer 0xED:

What I want to show here is that the selected data, consisting of 4 bytes, has the value 1094795598 when interpreted as a 32-bit integer (signed or unsigned does not matter at this point). But note that Big Endian is selected in the lower part of the screenshot. When it is changed to Little Endian, the bytes are read from right to left, changing the value of the int completely!
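Reading the file back in code shows the same effect (a sketch, assuming the file from the post exists and contains the four bytes written above; DataInputStream always reads big-endian, so Integer.reverseBytes is used here to simulate a little-endian reading):

import java.io.DataInputStream;
import java.io.FileInputStream;

public class ReadDemo {
    public static void main(String[] args) throws Exception {
        DataInputStream in = new DataInputStream(
                new FileInputStream("/users/koraytugay/desktop/koray3.txt"));
        int bigEndian = in.readInt(); // DataInputStream reads in big-endian order
        in.close();

        // What a little-endian interpretation of the same bytes would give
        int littleEndian = Integer.reverseBytes(bigEndian);

        System.out.println(bigEndian);    // 1094795598
        System.out.println(littleEndian); // 1312899393
    }
}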

Reading hex displays of numeric data on Big Endian systems is easy, because the digits appear in the order Western readers expect, with the most significant digits on the left. On Little Endian systems everything is reversed, and the more bytes used to represent a number, the more confusing it can become.
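You can see both readings side by side with a couple of standard JDK methods (a sketch; the class name is mine):

public class HexDemo {
    public static void main(String[] args) {
        int i = 1094795598;
        // Big-endian reading: most significant digits first
        System.out.println(Integer.toHexString(i));                       // 4141414e
        // Little-endian reading: the same value with its byte order reversed
        System.out.println(Integer.toHexString(Integer.reverseBytes(i))); // 4e414141
    }
}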