With the internet now an indispensable part of daily life, migrating to the cloud has become a primary strategy for businesses pursuing digital transformation. From multinational corporations to small and medium-sized enterprises, many businesses are adopting public or hybrid cloud solutions and moving local storage to the cloud. This has been a major driver of the rapid growth of cloud data storage services, with major cloud service providers worldwide continuing to expand their infrastructure to seize these opportunities. With many small and medium-sized enterprises also needing to establish cloud centers, demand for cloud services is expected to keep growing significantly. Cloud servers must deliver excellent, stable performance to meet these high computational demands, and the components within the server chassis, such as the CPU, GPU, motherboard, cooling system, and power supply, must be perfectly integrated.

Leveraging its robust electromechanical integration capabilities, Chenbro serves end customers such as cloud service providers through three modes: OTS standard products, JDM/ODM, and OEM+. Its solutions encompass modular, multi-configurable, highly compatible, and highly distinctive systems with the highest grade of electromechanical integration, catering to the diverse needs of different cloud data centers. In response to the global AI wave driving strong demand for high-performance server platforms from data centers and enterprise users, Chenbro offers a new generation of server chassis solutions for Intel and AMD platforms. These solutions are designed to each platform's specifications to ensure the most suitable airflow channels, a particularly critical consideration given that newer cloud application servers equipped with PCIe Gen 5 have transfer rates double those of Gen 4 enterprise-grade servers.
With various hard drive configuration options, customers can flexibly choose a solution that suits their specific needs. Chenbro's intricate and rigorous design process for cloud server chassis provides high stability and reliability for enterprises and cloud users. Examples include the RB151/RB251 T-Shaped series, pre-equipped with Intel motherboards as semi-integrated barebone systems suitable for mainstream storage, and the RM151/RM251 L-Shaped series, compatible with both the Intel Eagle Stream and AMD Genoa platforms and offering diverse hard drive configurations to meet different application needs.
In the era of massive data, the value of data has risen significantly with the dominance of cloud services and the growing number of data-driven services. As a result, enterprises are placing greater emphasis on data storage and analytics, leading to a surge in demand for large-scale data storage solutions. According to an IDC report, global data storage capacity has grown remarkably in recent years, from 33 ZB in 2018 to an estimated 175 ZB by 2025. Moreover, the diversity of data that enterprises collect today is increasing, requiring storage servers with higher-density storage structures; demand is also growing for servers with efficient cooling systems and high-performance computing capabilities. To meet these elevated requirements, Chenbro places its customers' needs at the core of its product design, creating unique, competitive, and more efficient solutions based on the latest market technology and design trends. For example, the high-speed computing demanded of modern server processors results in higher operating temperatures than in previous server designs, so Chenbro's focus on cooling mechanisms during product design ensures operational stability. Our innovative, patented high-density storage server chassis solutions, such as the RM25324 2U Dual-load Series and the RM43736 4U Tri-load Series, are prime examples of the outstanding thermal efficiency our solutions deliver. Chenbro's current range of storage servers, from entry-level to high-performance variants, boasts user-friendly designs and convenient maintenance features. Their high flexibility and diversity cater to various storage applications, including cold and hot data storage, security monitoring, databases, and archive servers.
Internet of Things (IoT) technology has become a staple of smart manufacturing, smart healthcare, smart retail, smart transportation, and more. With the number of connected devices worldwide continuing to increase each year, the international research firm Strategy Analytics forecasts that the global IoT device count will reach 38.6 billion by 2025 and 50 billion by 2030. With billions of connected devices around the world collecting data and connecting to the internet, demand is growing for faster, low-latency 5G networks. Many enterprises are opting for an edge computing network infrastructure to address potential issues such as network bandwidth shortages, which can cause unstable or interrupted data transmission. This involves deploying device-level edge servers that perform computations and store data locally. This approach reduces network bandwidth usage, minimizes latency, and shortens response times, enabling businesses to operate more efficiently and provide higher-quality services. Recognizing the sharp growth in demand for edge computing servers, Chenbro continues to innovate and tailor its products to 5G edge computing requirements, including diverse solutions for O-RAN and edge telecommunications base station equipment that meet the needs of high-availability network operators and help them create more agile communication services. For instance, Chenbro's RM252 and RM352 series are optimized for 5G and edge computing applications. The RM252 series supports up to three full-height or seven low-profile PCIe expansion slots alongside a single GPU, maximizing computing performance for distributed or open IT infrastructure edge sites. The RM352 series supports full-height PCIe expansion slots for GPU or FPGA accelerator cards, optimizing edge computing, machine learning, and base station applications at edge sites.
The generative AI wave sparked by ChatGPT continues to build momentum. Gartner's Top 10 Strategic Technology Trends for 2024 indicates that generative AI will open new possibilities, enabling humans to accomplish previously impossible tasks. Following this trend, AI-related technologies and products are gaining increasing attention, and AI servers that support large-scale data processing are becoming increasingly crucial. Because AI training models require substantial computational resources and data storage space to perform rapid, efficient inference, AI servers must be equipped with at least six to eight GPU processors to support the training of large models and the processing of massive amounts of data. With memory capacity expanding, the design of AI server chassis must also be upgraded to better integrate server components and accommodate more elements in a limited space. Chenbro's SR113 and SR115 upright convertible rack-mountable 4U server chassis are specifically designed for AI inference and deep-learning GPGPU servers, supporting up to five GPGPU cards so that AI inference servers can meet large-scale inference requirements through efficient computation. The SR115 liquid-cooling model is fitted with a water-cooling module whose excellent heat dissipation has been verified through testing, providing strong hardware support in a fully integrated AI inference server chassis.