Trade War with China


montyp165

Junior Member
Have you even done any work on AI? It has a tremendous barrier to entry: it requires good hardware, good software, and smart people. There are very few AI/microchip design experts, and they're all in the US. If the US won't take Chinese students or sell its tech, what can China do besides throw money at it?

AI will bring the cost of everything down, remove the need to outsource manufacturing, and raise people's quality of life.

I think it's the right direction. Better than mass socialism.

If anything, my feeling is that AI would eventually make possible the socialist goals that people have long dreamed of and envisioned but never reached.
 

manqiangrexue

Brigadier
Missing even one critical element would bring the whole ecosystem down.
That's true for all technology: jets, missiles, rockets, supercomputers. One critical failure will bring the whole system down. If you're afraid of that, you really can't be in tech.
I am not saying they shouldn't do AI, but they should spend no more than 50% of their resources on it, while the remaining 50% should go to fundamental science.
LOLOLOL That's exactly what you said before: that China shouldn't be doing AI because AI is a dead end and AI is like opium. Now you change to 50% focus on AI?? What does that mean, 50% opium usage is OK?? You clearly changed your indefensible stance just now. No more than 50%? Quite a random and arbitrary number, but it sounds much better than shunning AI while the rest of the world races to master it. I think China is investing in a wide range of technologies. How would you go about quantifying this 50% of resources that you think they exceeded? That needs many citations and calculations.

However, your newly amended opinion that China should divide its efforts between advanced sciences (like AI) and fundamental sciences is much more reasonable than before. I really like that idea; I like it so much that I actually came up with it several pages ago (in post #2474) in response to your post that China should shun AI and invest solely in fundamental sciences.
 

gelgoog

Brigadier
Registered Member
AI is kinda overrated. The Japanese government decided in the 1980s that they were going to skip regular CPUs and go straight to so-called fifth-generation computer hardware (i.e. AI). It was a bust. AI is a fad that comes and goes as new techniques become popular and people develop inflated expectations of them.

You have to learn to walk before you learn to run.

The South Koreans were a lot more focused. Samsung saw that consumer devices were going to be based more on electronics in the future and decided to invest in chip manufacturing to pursue that goal: to build actual devices people would use and buy.

Once you chase abstract goalposts like "AI" as a vague buzzword, you seldom get anything useful, as happened with the Japanese. What you get is a quagmire. "AI" is used in several important applications, but it is such a wide and diverse field that claiming you are investing in AI is not saying much at all. It is used in things like quality control in factories with machine-vision algorithms, and in text OCR. It is used in speech recognition and synthesis. It is also now becoming more common in applications like automated driving, namely obstacle detection, classification, avoidance, and path finding. So in that sense it is important today and in the future. All of these areas have been beaten to death in the robotics field, with slow progress typically measured in decades, not years. IBM worked on speech recognition some five decades ago and had practical implementations, but it was considered an unnecessary luxury in most cases, so only much later did you see widespread use of it. Now every smartphone does it.
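To make the path-finding piece of that list concrete, here is a minimal sketch using plain breadth-first search on a hypothetical occupancy grid. The grid, start, and goal are invented for illustration; real driving stacks use far more elaborate planners, but the underlying "find a route around obstacles" idea is the same.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.
    grid[r][c] == 1 marks an obstacle; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk back through predecessors to reconstruct the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# Hypothetical 4x4 map: 0 = free, 1 = obstacle.
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(shortest_path(grid, (0, 0), (3, 3)))
```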

At the same time as the Japanese government, the US government also made investments in AI. The only program that can be called a success over that entire decade of US government-funded research was when the US Army (not DARPA) invested in software to optimize military logistics. That was put into practical use, saved the Army a considerable amount of resources, and was later used in other areas.

However, if you do not even have the basic means to design and manufacture chips, computer volatile and non-volatile memory, or build decent-quality automobiles, perhaps you should have other priorities than "AI", preferably something you can measure and describe. Saying your goal is "AI" is kinda like declaring a "War on Drugs" or a "War on Terrorism". These are nice catchphrases, but how can you even define goals for something so nebulously phrased? Compare that objective with "we will put a man on the moon by the end of this decade". Now that is a well-defined political objective with a limited scope. It might have been a waste of resources, but at least it was clearly defined, so you could track progress.

In most cases where people mention "AI", these are either software or hardware implementations of algorithms. But claiming it is a post-von Neumann architecture in the cases where people implement AI today, like on smartphone SoCs, is hogwash. There are proposals for alternative architectures, like DARPA's research program on graph computing, which could have applications both in AI and elsewhere. But that is not what you typically hear about. It could also be a bust in the long term, just like other DARPA programs in the past, such as multiflow architectures.
 

tidalwave

Senior Member
Registered Member
That's true for all technology: jets, missiles, rockets, supercomputers. One critical failure will bring the whole system down. If you're afraid of that, you really can't be in tech.

LOLOLOL That's exactly what you said before: that China shouldn't be doing AI because AI is a dead end and AI is like opium. Now you change to 50% focus on AI?? What does that mean, 50% opium usage is OK?? You clearly changed your indefensible stance just now. No more than 50%? Quite a random and arbitrary number, but it sounds much better than shunning AI while the rest of the world races to master it. I think China is investing in a wide range of technologies. How would you go about quantifying this 50% of resources that you think they exceeded? That needs many citations and calculations.

However, your newly amended opinion that China should divide its efforts between advanced sciences (like AI) and fundamental sciences is much more reasonable than before. I really like that idea; I like it so much that I actually came up with it several pages ago (in post #2474) in response to your post that China should shun AI and invest solely in fundamental sciences.

When did I say shunning AI? I said devoting every resource to AI is comparable to smoking opium; it's unrealistic.
 

manqiangrexue

Brigadier
When did I say shunning AI? I said devoting every resource to AI is comparable to smoking opium; it's unrealistic.
Every post; you called it opium twice. Usually that means to shun it. Did you mean that one should spend half his resources on opium? That's not what most people consider a healthy lifestyle LOL. You're trying to weasel out of what you said before.

China has never devoted EVERY resource to AI. It's ridiculous to think that. Now you say 50% is ok. So what percent are we at? Where are your sources and calculations? You seem to be making up random numbers without evidence and criticizing people for (supposedly) not following your imaginary numbers.
 

Hendrik_2000

Lieutenant General
AI is kinda overrated. The Japanese government decided in the 1980s that they were going to skip regular CPUs and go straight to so-called fifth-generation computer hardware (i.e. AI). It was a bust. AI is a fad that comes and goes as new techniques become popular and people develop inflated expectations of them.

You have to learn to walk before you learn to run.

The South Koreans were a lot more focused. Samsung saw that consumer devices were going to be based more on electronics in the future and decided to invest in chip manufacturing to pursue that goal: to build actual devices people would use and buy.

Once you chase abstract goalposts like "AI" as a vague buzzword, you seldom get anything useful, as happened with the Japanese. What you get is a quagmire. "AI" is used in several important applications, but it is such a wide and diverse field that claiming you are investing in AI is not saying much at all. It is used in things like quality control in factories with machine-vision algorithms, and in text OCR. It is used in speech recognition and synthesis. It is also now becoming more common in applications like automated driving, namely obstacle detection, classification, avoidance, and path finding. So in that sense it is important today and in the future. All of these areas have been beaten to death in the robotics field, with slow progress typically measured in decades, not years. IBM worked on speech recognition some five decades ago and had practical implementations, but it was considered an unnecessary luxury in most cases, so only much later did you see widespread use of it. Now every smartphone does it.

At the same time as the Japanese government, the US government also made investments in AI. The only program that can be called a success over that entire decade of US government-funded research was when the US Army (not DARPA) invested in software to optimize military logistics. That was put into practical use, saved the Army a considerable amount of resources, and was later used in other areas.

However, if you do not even have the basic means to design and manufacture chips, computer volatile and non-volatile memory, or build decent-quality automobiles, perhaps you should have other priorities than "AI", preferably something you can measure and describe. Saying your goal is "AI" is kinda like declaring a "War on Drugs" or a "War on Terrorism". These are nice catchphrases, but how can you even define goals for something so nebulously phrased? Compare that objective with "we will put a man on the moon by the end of this decade". Now that is a well-defined political objective with a limited scope. It might have been a waste of resources, but at least it was clearly defined, so you could track progress.

In most cases where people mention "AI", these are either software or hardware implementations of algorithms. But claiming it is a post-von Neumann architecture in the cases where people implement AI today, like on smartphone SoCs, is hogwash. There are proposals for alternative architectures, like DARPA's research program on graph computing, which could have applications both in AI and elsewhere. But that is not what you typically hear about. It could also be a bust in the long term, just like other DARPA programs in the past, such as multiflow architectures.

The Japanese were a bit early, but their goal was actually not that far off. Nowadays there are alternatives to the CPU, or sequential machine; notice the proliferation of different kinds of processors: GPUs, FPGAs, ASICs, and neural processors.
The von Neumann machine depends on sequential operation.


And at some point it will hit a bottleneck. So far they have been able to overcome it by packing more transistors onto the chip, the so-called Moore's Law. But Moore's Law is coming to an abrupt end, since anything close to 5 nm will cause interference, and we are approaching that limit. Notice Intel's failure to develop even 10 nm technology, and that the other fab, GlobalFoundries, gave up completely.


Death of Moore’s Law

In 1965, Dr. Gordon E. Moore wrote an article based on a trend he noticed: the number of transistors in an integrated circuit (IC) doubles approximately every two years[1]. Fueled by unrelenting demands from more complex software, faster games and broadband video, Moore's Law has held true for over 50 years.[2] It became the de facto roadmap against which the semiconductor industry drove its R&D and chip production, as facilitated by SEMATECH's pre-competitive agreements among global semiconductor manufacturers. Recently, that roadmap has faltered due to physics limitations and the high cost-benefit economics incurred by the incredibly small scales that chip manufacturing has reached. Electron leakages and difficulties shaping matter at the single-digit nanometer scales of the transistors fundamentally limit further miniaturization. So many electrons are being moved through such tight spaces so quickly that there is an entire field in the semiconductor industry devoted just to chip cooling; without thermal controls, the ICs simply fry and fail. A new fabrication plant (fab) can cost more than $10 billion, severely limiting the number of companies able to produce denser ICs.

Despite the looming end of Moore’s Law, computationally-intensive artificial intelligence (AI) has exploded in capabilities in the last few years – but how, if compute is slowing down? The solution to exceeding compute limitations of traditional von Neumann style central processing units (CPUs)[3] has been to invent and leverage wholly new architectures not dependent on such linear designs[4].

A veritable zoo of compute architectures – including GPUs[5], ASICs[6], FPGAs[7], quantum computers, neuromorphic chips, nanomaterial-based chips, optical-based ICs, and even biochemical architectures - are being researched and/or implemented to better enable deep learning and other instantiations of AI. Here we review the latest / greatest[8] of non-CPU computer architectures relevant to AI. In each section, we describe the hardware, its impact to AI, and a selection of companies and teams active in its R&D and commercialization.
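As a rough illustration of the doubling rule described in the quoted article, the sketch below projects transistor counts from a 1971 baseline. The starting figure (roughly 2,300 transistors, the Intel 4004) is supplied here purely for illustration and is not taken from the article.

```python
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Project transistor count under Moore's observation:
    the count doubles every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Print projected counts at ten-year intervals.
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
```

The exponent is why the trend cannot run forever: by this rule a 2021 chip would carry tens of billions of transistors, which is roughly where real designs sit today and where the physical limits the article describes start to bite.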

So yeah, the von Neumann machine architecture is drawing to a close.

The US and the West have a commanding lead on the von Neumann architecture due to legacy: the integrated circuit was invented in the US through the work of Kilby at Texas Instruments and Noyce at Fairchild, who later co-founded Intel in 1968.


Now, what was China doing in 1967? The Cultural Revolution was wrecking all the industrial and research institutions in China, and the universities did not open their doors again until the 1980s.
So your comparison to South Korea is faulty. South Korea had an early start and uninterrupted industrial development since the 1960s, and she had complete access to American technology and research and development.
China, meanwhile, was put on the technology embargo lists: first CoCom, then Wassenaar, and now ITAR.

Another thing: China was VERY late to the semiconductor business. She spent the first 20 years after the Cultural Revolution ended just building a basic industrial base, starting with pots and pans, clothing, and plastic toys.
Look at it this way: Huawei, the premier Chinese high-tech company, was only founded in 1987 and spent its first ten years importing phone switches before making them itself, so it was not until 2000 that it started making its own switches.


So basically, of course business volume was low in the early years, since they were only catering to domestic demand; it was not until 2005 that Huawei started exporting phone switches.
As I said before, the high-tech industry did NOT create large demand for semiconductors until the last 10 years.
So of course, if there is no large demand, why build large and expensive semiconductor fabs? It made more sense to import the chips.

Now that the demand is there, the government has realized the vulnerability of China's high-tech industry and started a program of building semiconductor fabs, since none of the private companies have the capital, the R&D effort, etc.
But it takes time, since even building the factories requires exacting construction.
Concurrently, the semiconductor tool industry did not start in earnest until 2010.

But now that everybody has realized the strategic importance of semiconductors, they are building fabs like there is no tomorrow. China has programs to develop the whole range of processors, from GPUs to ASICs. Some already have prototypes and will commence production soon.

And any effort to stymie it will come to naught, since the technology has already diffused. Generations of Chinese scientists have learned the technology in the West. The problem is there are not enough of them, and training will take some time.
Right now there is a plethora of Chinese semiconductor tool fabricators in China; they might not have the latest technology, but it is good enough for most applications.
 

shifty_ginosaji

Just Hatched
Registered Member
@Hendrik_2000 @gelgoog

AI is still a developing field. Blanket statements beyond the obvious (such as pointing out that physics has compelled an end to the von Neumann architecture) ignore that fact. Ultimately, different companies, countries, and cultures will try different approaches, and there is no consensus on what the dominant architecture will be.

I don't think the military has led this, as the market is really skewed towards information aggregators and advertisers (read: Google and Amazon) compared to defence contractors. A more important topic to discuss might be how China can leverage its 'private' sector, whereas the US sees intense resistance from many parties whenever it tries to leverage its intellectual capital.
 

Hendrik_2000

Lieutenant General
@Hendrik_2000 @gelgoog

AI is still a developing field. Blanket statements beyond the obvious (such as pointing out that physics has compelled an end to the von Neumann architecture) ignore that fact. Ultimately, different companies, countries, and cultures will try different approaches, and there is no consensus on what the dominant architecture will be.

I don't think the military has led this, as the market is really skewed towards information aggregators and advertisers (read: Google and Amazon) compared to defence contractors. A more important topic to discuss might be how China can leverage its 'private' sector, whereas the US sees intense resistance from many parties whenever it tries to leverage its intellectual capital.

There might be different routes to AI technology, but the CPU, or linear machine, is not one of them, simply because AI requires speed and massive computation that a CPU machine can't provide; it would take too long, because you can't run parallel computations and everything has to go sequentially!
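To make the sequential-versus-parallel point concrete, here is a small sketch contrasting a plain Python loop (one multiply-add at a time, the way a single in-order core works) with a vectorized NumPy dot product that hands the whole computation to optimized parallel routines. The array size and the use of NumPy are just for illustration; a GPU or neural accelerator pushes the same "many operations at once" idea much further than a desktop CPU.

```python
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Sequential: step through the arrays one element at a time.
t0 = time.perf_counter()
total = 0.0
for i in range(n):
    total += a[i] * b[i]
t_loop = time.perf_counter() - t0

# Vectorized: NumPy dispatches the dot product to SIMD/multithreaded BLAS code.
t0 = time.perf_counter()
total_vec = float(np.dot(a, b))
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s   vectorized: {t_vec:.4f}s   "
      f"same result: {np.isclose(total, total_vec)}")
```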
 