Why would Hollywood want to piss off a bunch of people just to push their own political agenda?
That's because you're seeing this through an American lens alone, and ignoring the fact that, despite involvement from American studios, Bond is a British series, not a Hollywood franchise.
In the British context, it's not obvious this would be a political statement at all. British TV is full of great, non-stereotypical characters played by black actors. The whole social context is entirely different.