[And I'm STILL not seeing your posts. <frown> Something must
be broken in my server. Yet the telephone system seems to be
working properly! Yet another thing to look into...]
Perhaps Grant could re-post for me here?
Maybe you killfiled me?
I think you are using news.eternal-september.org for your server, which
is the same as me.
Post by Grant Edwards
Post by David Brown
Some vendor-supplied toolchains are not bad, but some are definitely
subpar - and often many years behind the versions you get from
manufacturer-independent suppliers (like ARM's build of a gcc toolchain,
or commercial gcc toolchains). The biggest problem with microcontroller
manufacturers' tools is usually the SDKs, which are frequently horrible
in all sorts of ways.
I think vendors offer tools for much the same reason as they
write app notes or illustrate "typical applications". Their
goal is to get you USING their product, as quickly as
possible. If you had to hunt for development tools, then
you would likely bias your device selection based on the
availability and cost of said tools.
Yes.
Sometimes I think they could put more effort into making sure you want
to /keep/ using their products!
[I worked with a firm that offered a "development system"
(compile/ASM/debug suite plus hardware ICE, in the days before JTAG
and on-chip debug existed) for an obscure, old processor. They
recounted a story where
a customer purchased said tool -- for a fair bit of money. And, promptly
RETURNED it with a *nasty* note complaining about the (low) quality
of the fabrication! Some time later, they REordered the system...
when they discovered there were no other offerings in that market!]
That's it - you don't have to be good to conquer a market, you just have
to be the best available.
Note the inroads Microchip made (esp with hobbyists) by offering
free/low cost tools for their devices. I suspect their devices
were not the ideal choices for those applications -- but, the value
of HAVING the tools without spending kilobucks to buy them weighed
heavily in their decisions!
The success of Microchip here has always confused me. Their hardware
development tools were not good or particularly cheap when I used them
in the late 1990s. Their software development tools were terrible. I
remember IDEs that wouldn't work on newer PCs, only a fairly basic
assembler with very limited debug support, and C compilers that were
extraordinarily expensive, full of bugs, incompatibilities and crippling
limitations.
They did have a few things going for them - the PIC microcontrollers
were very robust, came in hobby-friendly packages, and never went out of
production. But their tools, beyond the lowest-level basics, were
expensive and very poor quality, even compared to the competition at the
time. They still are - I don't know any other manufacturer that still
charges for toolchains. And their prices are obscene - $1.6k per year
to enable optimisation on their packaging of gcc-based toolchains for
ARM and MIPS. The only effort Microchip have made to justify the price
is all their work in trying to make it look like they wrote the compiler
and it's not just gcc with added licensing locks.
Post by Grant Edwards
Post by David Brown
But I agree with your advice - where possible, use ARM's gcc toolchain
build for ARM development. And make sure your project build is
independent of any IDE, whether it is from the vendor or independent.
IDEs are great for coding, and vendor-supplied IDEs can give good
debugging tools, but you want the project build itself to be
independent.
This is a given, regardless. "Sole source" suppliers are always a risk.
Moving from one ARM to another (vs a totally different architecture)
saves a lot -- but, you are still stuck with the choices the fab made
in what they decided to offer.
Post by Grant Edwards
What he said, definitely: Avoid vendor-specific IDEs and SDKs like the
plague.
Demo apps and libraries from silicon vendors are usually awful -- even
worse than the toolchains. I'm pretty sure they're written by interns
who think that to be "professional", code has to incorporate layers and
layers of macros and objects and abstraction and polymorphism and
whatnot.
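A pin toggle, for instance, needs nothing more than something like
this (the register address is invented for illustration):

    #include <stdint.h>

    /* hypothetical memory-mapped GPIO output register */
    #define GPIOA_ODR (*(volatile uint32_t *)0x40010c0cu)

    static void led_on(void)
    {
        GPIOA_ODR |= (1u << 5);   /* drive pin 5 high */
    }

but the SDK version routes that one store through a handle, a config
struct, an init call, and a write-with-options wrapper.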
The same is often true of "app notes" for hardware components.
I interviewed a prospective client about a project. Somewhere
along the line he "proudly" presented his proposed "solution"
(then, what do you need ME for?).
I looked at it (schematic) and replied: "This won't work."
I.e., there were signals with no driving sources! He then
admitted to copying it from an app note (said reproduction
later verified to have been accurate; the app note was in error!)
But, you aren't likely going to RUN those apps. And, libraries
can be rebuilt and redesigned. So, you aren't at their "mercy"
for those things.
OTOH, if the vendor has some knowledge of a device defect (that
they aren't eager to publicize -- "trade secrets") but their
tools are aware of it and SILENTLY work-around it, then they
have a leg up on a third-party who is trying to DEDUCE how
the device works from the PUBLISHED knowledge (and personal
observations).
Call me naïve, but I don't see this happening much. The manufacturers
we use for microcontrollers tend to be quite open about defects and
workarounds, as are ARM (and as I wrote earlier, for the Cortex-M
devices the cores come ready-made from ARM, with a lot less scope for
vendor-specific bugs). A vendor that tried to hide a known defect in an
ARM core would suffer - they would get caught, and getting on the bad
side of ARM is not worth it.
But it is certainly possible that a vendor-supplied toolchain build will
have the workaround before it has made it into other toolchains. Hiding
defects or lying about them is bad - but being first to have a fix is fine.
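Most published errata end up handled in plain application code anyway,
with the workaround lifted straight from the errata sheet. A minimal
sketch of the usual pattern, assuming an invented peripheral and an
invented erratum:

    #include <stdint.h>

    /* hypothetical memory-mapped UART status register */
    #define UART0_STAT (*(volatile uint32_t *)0x40001008u)

    /* Invented erratum: the first read after the peripheral wakes can
       return stale data.  The documented workaround is to read twice
       and keep the second value. */
    static uint32_t uart0_status(void)
    {
        (void)UART0_STAT;    /* dummy read, result discarded */
        return UART0_STAT;   /* second read is valid */
    }

And for workarounds in the compiler itself, gcc's -mfix-cortex-m3-ldrd
option (for a published Cortex-M3 erratum) has, as far as I know, been
in mainline for years - nothing secret about it.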
E.g., I would be happier knowing that a compiler would avoid
generating code that could tickle vulnerabilities in a
system (e.g., memory) over one that blindly strives for
performance (or is simply ignorant of the issue).
Sure. But you are worrying about nothing, I think. Have you ever seen
a compiler where you know the developers /intentionally/ ignored flaws
or incorrect code generation?
Or, a compiler that knows enough about the SPECIFIC processor
(not just the FAMILY) to know how to more finely optimize its
scheduling of instructions.
That's a good reason for using compiler builds from ARM, rather than
microcontroller vendors - they know the cpus better. Target-specific
optimisations are always passed on to mainline gcc (and clang/llvm), but
can be in ARM's builds earlier. A vendor that builds its own toolchains
might pull in these patches too, but they might not. (Most vendors of
ARM microcontrollers use the compiler builds from ARM, but these might
be a bit dated.)
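The effect is easy to demonstrate - compile the same function for two
cores in the family and diff the assembly. The function below is
arbitrary:

    /* Build twice and compare the generated code:
     *
     *   arm-none-eabi-gcc -O2 -mcpu=cortex-m4 -S dot.c -o dot-m4.s
     *   arm-none-eabi-gcc -O2 -mcpu=cortex-m7 -S dot.c -o dot-m7.s
     *
     * The Cortex-M7 is dual-issue, so the instruction scheduling
     * differs even though both targets are in the same family.
     */
    int dot(const int *a, const int *b, int n)
    {
        int sum = 0;
        for (int i = 0; i < n; i++)
            sum += a[i] * b[i];
        return sum;
    }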
[The days of expecting the code to just implement the expressed
algorithm are long past. "What ELSE are you going to do FOR me?"]
Post by Grant Edwards
As a result I remember failing to get a vendor's "hello world" demo
to run on a Cortex-M0+ because it was too large for both the flash and
RAM available on the lower end of the family. And it wasn't even
using "printf", just a "low level" serial port driver that should have
been a few hundred bytes of code but was actually something like
10KB.
But, if they published that code, you could inspect it, determine
what it was TRYING to do and fix it (or, take a lesson from it).
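A bare polled transmitter is only a handful of lines. A sketch, for a
hypothetical UART (the addresses and status bit are invented):

    #include <stdint.h>

    /* hypothetical memory-mapped UART registers */
    #define UART0_STAT  (*(volatile uint32_t *)0x40002000u)
    #define UART0_DATA  (*(volatile uint32_t *)0x40002004u)
    #define TX_READY    (1u << 0)

    static void uart0_putc(char c)
    {
        while (!(UART0_STAT & TX_READY))
            ;                        /* spin until transmitter is free */
        UART0_DATA = (uint32_t)c;
    }

    static void uart0_puts(const char *s)
    {
        while (*s)
            uart0_putc(*s++);
    }

That compiles to a few dozen bytes; everything beyond it is the SDK's
scaffolding, not a requirement of the hardware.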
It's amusing when these sorts of things are treated as "proprietary
information" ("secret"). There's nothing revolutionary in a GENERIC
standard library implementation (though there are varying degrees
of performance that can be obtained from those that are SPECIALIZED).