Chinese OS and software ecosystem

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member
Loongson sure is passionate about this localization drive

They are developing their own ERP platform with just Loongson chips, Loongson cloud and such. Not sure who is going to use this, but almost all of their own computers and systems have moved to use Loongson CPUs.

They are also working with domestic EDA tool vendors to bring their tools to the Loongson platform
 

BlackWindMnt

Captain
Registered Member
The Free Pascal compiler now supports LoongArch. Since I have never used Pascal in all the years I've programmed, I can only assume this is not a big deal.
I know some old developers who programmed in Pascal with the Turbo Pascal variant from Borland. They said good things about the language, especially the fast compilation times helping with code iteration speed.

But in 2023, more interesting language support would be Rust, which has major backing from Microsoft, Amazon, Huawei, etc.

GitHub's ongoing ticket regarding LoongArch support:
 

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member
From the infamous Huawei Whisper

KaihongOS, derived from the open-source HarmonyOS (OpenHarmony), has now been launched.

This system was developed by Shenzhen Kaihong based on OpenHarmony and is now in use at Tianjin Port. As I keep saying, the "HarmonyOS" many people have in mind is not necessarily Huawei's own HarmonyOS. OpenHarmony is like open-source Android: any vendor can take the open-source code and build its own system or UI on top of it.

So in theory, a system based on OpenHarmony can take any name; it doesn't all have to be called HarmonyOS, it can be called something else.
KaihongOS is out. It's an OS based on OpenHarmony. In the future, there will be many HarmonyOS forks, just like the open-source Android forks.
 

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member

Huawei is about to release GaussDB, China's first full-stack independent database with software and hardware collaboration. This DB has already been discussed a lot when looking at the internal stack for HW. So now they are looking to sell this distributed DB. I think it will attract a lot of attention from SOEs, governments and such.
 

vincent

Grumpy Old Man
Staff member
Moderator - World Affairs

Huawei is about to release GaussDB, China's first full-stack independent database with software and hardware collaboration. This DB has already been discussed a lot when looking at the internal stack for HW. So now they are looking to sell this distributed DB. I think it will attract a lot of attention from SOEs, governments and such.
Changing DB is a painful process. Large and complex ETL projects can take years.
 

tokenanalyst

Brigadier
Registered Member
Huawei will release a 100-billion-parameter large model product called "Pangu Chat" that directly targets ChatGPT!


According to Huawei's internal sources, Huawei Pangu Chat is expected to be released and internally tested at the Huawei Cloud Developer Conference held on July 7 this year. The product is mainly aimed at government and enterprise customers.

This means that in the "arms race" of domestic large models, another major technology giant has entered the game after Alibaba and Baidu.

It is reported that the Pangu model project was established within Huawei Cloud in November 2020. For the positioning of the Pangu model, Huawei's internal team set key core design principles: first, the model must be large enough to absorb massive amounts of data; second, its generalization ability must be strong enough for it to truly be applied to work scenarios in all walks of life.

According to a paper released by Huawei, Huawei's PanGu-Σ large model has up to 1.085 trillion parameters and was developed on Huawei's self-developed MindSpore framework. Overall, the PanGu-Σ large model may be close to the level of GPT-3.5 in dialogue.



According to public information, the Huawei Pangu model was officially released in April 2021 and upgraded to version 2.0 in April 2022. At present, the NLP large model, CV large model, and scientific computing large model (the meteorological model) are marked as online.

According to reports, this was the first Chinese pre-trained large model with 100 billion parameters, and its CV large model was the first to reach 3 billion parameters. The Pangu CV large model is the largest CV model in the industry, the first to achieve both discrimination and generation capabilities, and first in the industry in small-sample learning ability on ImageNet; the Pangu meteorological large model provides second-level weather forecasts; and Zidong Taichu is the world's first tri-modal large model spanning image, text, and audio.


According to PPT slides from Huawei Cloud executives' speeches, the basic layer of Huawei's "Pangu series AI large models" currently consists mainly of the NLP large model, CV large model, and scientific computing large model, while the upper layer consists of industry large models that Huawei develops with partners.

The Huawei Cloud official website shows that the Pangu large model is composed of multiple models, including an NLP large model, CV large model, multi-modal large model, and scientific computing large model. It is aimed at the problems of scaling and industrializing AI, and can support a variety of natural language processing tasks, including text generation, text classification, question answering systems, and so on.

Specifically, the Pangu NLP large model is the first to use an Encoder-Decoder architecture, balancing the comprehension and generation capabilities of an NLP model and ensuring the flexibility of embedding the model in different systems. In downstream applications, only a small number of samples and learnable parameters are needed to complete rapid fine-tuning and downstream adaptation of the 100-billion-parameter model. The model performs well in intelligent public-opinion analysis and intelligent marketing.
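The "small number of samples and learnable parameters" idea above is essentially head-only, parameter-efficient fine-tuning: the huge pretrained weights stay frozen and only a tiny readout layer is trained on the few downstream samples. A minimal NumPy sketch of that pattern, where a fixed random projection stands in for the frozen pretrained encoder (all names, shapes, and the toy dataset are illustrative, not Huawei's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" frozen encoder: a fixed random projection standing in
# for the billions of frozen parameters of a real large model.
W_frozen = rng.normal(size=(16, 8))

def encode(x):
    # Frozen forward pass; W_frozen is never updated during fine-tuning.
    return np.tanh(x @ W_frozen)

# Small learnable head: the only trainable parameters (8 weights + 1 bias).
w_head = np.zeros(8)
b_head = 0.0

# Tiny "few-shot" downstream dataset: 20 examples for a binary task.
X = rng.normal(size=(20, 16))
y = (X[:, 0] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fine-tune only the head with plain gradient descent on logistic loss.
lr = 0.5
for _ in range(200):
    h = encode(X)                        # frozen features
    p = sigmoid(h @ w_head + b_head)     # head prediction
    grad = p - y                         # d(log-loss)/d(logit)
    w_head -= lr * (h.T @ grad) / len(y)
    b_head -= lr * grad.mean()

acc = ((sigmoid(encode(X) @ w_head + b_head) > 0.5) == y).mean()
print(f"train accuracy after head-only fine-tuning: {acc:.2f}")
```

The design point is that only 9 parameters are updated here, so adaptation is cheap and fast even if the frozen encoder were enormous; real systems use the same split (frozen backbone, small tunable part) at vastly larger scale.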


 