Well, when moving to a new architecture, there are three main impediments:
First is a compiler. ~Most~ code is written at a high level. Even in Linux, pretty much everything but the first stage of the bootstrap loader is written in some version of C and compiled to machine code. So if you have a compiler for the new architecture, most of that high level code will translate right over with only minor changes and optimization needed.
Second would be the endianness of the architectures. To start with, I'm assuming binary - if you had something else (ternary, quantum, etc.) then that also falls into this category for similar reasons. Endianness is the order in which the bytes of a multi-byte value are stored in memory: little endian puts the least significant byte first, big endian puts the most significant byte first. There is no standard, per se, apart from how you design the architecture to work. The two-byte value 0x0102 would sit in memory as the bytes 02 01 on a little endian machine, or 01 02 on a big endian one. This can not only affect software, but hardware that plugs in (like via a PCI bus). If two different architectures are the same endianness, then largely it's just a matter of drivers to get hardware support for everyday things. If not, then... it gets complicated. PPC was big endian where x86 is little endian, and that's part of why back in the PPC days you needed special versions of video cards to work on Macs - any old off-the-shelf card wouldn't work with a PPC Mac.
x86 is "little endian". ARM is actually biendian, meaning you can configure it to work either way, but it is largely operated in little endian mode, storing bytes in the same order x86 does. Given that ARM and x86 can share the same endianness, getting most hardware intended for use on x86 to work on ARM should be fairly straightforward.
Lastly would be the "chicken and the egg" syndrome. You can't really use a new architecture unless there are things to do with it. That means it needs drivers, it needs software, and it needs an operating system of some sort. You can't get that until you've solved problems #1 and #2; only then do you gain some critical mass of developers making things for your architecture, and users who buy in and spur the developers to keep pumping products into it.
If you take ARM - it took off because the world needed a low power processor that was "fast enough". It needed to be able to run on a battery for days and/or stuffed into a tiny plastic enclosure with no active cooling (routers, TVs, etc.), and it needed to be fast enough to run a GUI and do some light processing tasks. There really wasn't any real competition at the time - some of the older TI and Motorola processors might fit the bill, but they didn't have the performance, and Intel didn't have anything as they were busy chasing high performance applications. It didn't ~have~ to be, but it certainly helped that the design was licensable, and you could tweak it as needed for specific applications. ARM was very much a "right place at the right time" architecture. Now that it's past that chicken and the egg stage, it'll be around for a while, but anything competing with it has that uphill fight to get through to establish any market share.