Acer’s industrial server solutions subsidiary Altos Computing has launched the new BrainSphere T15 F6 server alongside the Altos aiWorks 4.0 AI Computing Platform.
We’ll talk about the hardware first. The BrainSphere T15 F6 is positioned as an entry-level system for SMEs, built to power in-house edge computing workloads as well as private and hybrid cloud server needs.
Built on an AM5 motherboard running AMD EPYC 4004 series processors, it features 3D V-Cache support for expanded L3 cache capacity, pairs high-speed DDR5 memory with PCIe 5.0 slots for cutting-edge compute cards, and offers USB 3.2 ports. Storage-wise, its M.2 slots can be populated with up to 32TB of capacity.
It also comes with a DASH-standard Realtek network controller in place of a conventional BMC, so system admins can remotely access monitoring data such as device power states, health status, system log records, and more. In short, the system is an industrial-class, business-ready solution suited to a wide range of industries.
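For those curious what management over DASH actually looks like in practice: DASH is built on the DMTF's WS-Management protocol (SOAP over HTTP), so a management station can pull system information with a plain WS-Man Enumerate request. The sketch below is a minimal, hypothetical example; the host address, port, credentials, and the exact CIM classes exposed are placeholder assumptions that will depend on the specific Realtek controller and your network setup.

```python
import uuid
import requests
from requests.auth import HTTPDigestAuth

# Placeholder values for illustration only: address, port, credentials, and the
# CIM class queried all depend on the actual DASH implementation and deployment.
HOST = "192.168.1.50"
ENDPOINT = f"http://{HOST}:623/wsman"  # 623 is the usual DASH HTTP management port
RESOURCE = ("http://schemas.dmtf.org/wbem/wscim/1/"
            "cim-schema/2/CIM_ComputerSystem")  # basic system identity/state info

# Minimal WS-Management Enumerate request (DASH is built on WS-Man/SOAP 1.2).
envelope = f"""<s:Envelope
    xmlns:s="http://www.w3.org/2003/05/soap-envelope"
    xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/08/addressing"
    xmlns:wsen="http://schemas.xmlsoap.org/ws/2004/09/enumeration"
    xmlns:wsman="http://schemas.dmtf.org/wbem/wsman/1/wsman.xsd">
  <s:Header>
    <wsa:To>{ENDPOINT}</wsa:To>
    <wsa:Action s:mustUnderstand="true">http://schemas.xmlsoap.org/ws/2004/09/enumeration/Enumerate</wsa:Action>
    <wsa:MessageID>uuid:{uuid.uuid4()}</wsa:MessageID>
    <wsa:ReplyTo>
      <wsa:Address>http://schemas.xmlsoap.org/ws/2004/08/addressing/role/anonymous</wsa:Address>
    </wsa:ReplyTo>
    <wsman:ResourceURI>{RESOURCE}</wsman:ResourceURI>
  </s:Header>
  <s:Body>
    <wsen:Enumerate/>
  </s:Body>
</s:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=envelope,
    headers={"Content-Type": "application/soap+xml;charset=UTF-8"},
    auth=HTTPDigestAuth("admin", "password"),  # placeholder credentials
    timeout=10,
)
print(response.status_code)
print(response.text)  # SOAP reply carrying the enumerated CIM instances
```

The same pattern works for other monitoring targets the article lists (power states, health, logs); only the resource URI changes.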
As for the software side, the aiWorks platform mentioned earlier is now live with its version 4.0 update, which integrates software stacks powered by NVIDIA NeMo as well as NVIDIA L40S GPUs. The expanded ecosystem gives more institutions, such as research teams, easier access to the power of Gen-AI.
The move also aims to build a more mature computing platform that nurtures upcoming talent by embracing best-in-class hardware and software (particularly from NVIDIA, everybody knows why). Specifically, it lets Altos servers, workstations, GPUs, and in-house accelerators work together seamlessly and scale smoothly to meet growing demands for compute power and task allocation management.
Beyond facilitating the most crucial stages of AI model training and inferencing, Altos has also made major user interface optimizations for cleaner, more concise navigation while retaining the features and advantages of previous generations. Other key updates include a multi-process service-sharing strategy that provides more flexible and adaptable allocation of AI compute resources.
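The announcement doesn't detail that multi-process service-sharing strategy, but it reads a lot like NVIDIA's Multi-Process Service (MPS), which lets several jobs share a single GPU's compute capacity. Purely as an illustrative sketch (not Altos' documented implementation), here is how that kind of GPU sharing is typically wired up; the directory paths, the 50% cap, and the inference_job.py script are placeholder assumptions.

```python
import os
import subprocess

PIPE_DIR = "/tmp/nvidia-mps"      # where MPS clients find the control pipe (assumed path)
LOG_DIR = "/tmp/nvidia-mps-log"   # MPS daemon log directory (assumed path)

def start_mps_daemon():
    """Launch the MPS control daemon so co-located processes can share the GPU."""
    os.makedirs(PIPE_DIR, exist_ok=True)
    os.makedirs(LOG_DIR, exist_ok=True)
    env = os.environ.copy()
    env["CUDA_MPS_PIPE_DIRECTORY"] = PIPE_DIR
    env["CUDA_MPS_LOG_DIRECTORY"] = LOG_DIR
    subprocess.run(["nvidia-cuda-mps-control", "-d"], env=env, check=True)

def run_capped_client(cmd, thread_percentage=50):
    """Run a GPU workload limited to a fraction of the GPU's streaming multiprocessors."""
    env = os.environ.copy()
    env["CUDA_MPS_PIPE_DIRECTORY"] = PIPE_DIR
    env["CUDA_MPS_ACTIVE_THREAD_PERCENTAGE"] = str(thread_percentage)
    return subprocess.Popen(cmd, env=env)

def stop_mps_daemon():
    """Ask the MPS control daemon to shut down."""
    env = os.environ.copy()
    env["CUDA_MPS_PIPE_DIRECTORY"] = PIPE_DIR
    subprocess.run(["nvidia-cuda-mps-control"], input=b"quit\n", env=env, check=True)

if __name__ == "__main__":
    start_mps_daemon()
    # Two inference jobs sharing one GPU, each capped at half of its compute.
    # inference_job.py is a hypothetical workload, stand-in for any CUDA process.
    jobs = [run_capped_client(["python", "inference_job.py"], 50) for _ in range(2)]
    for job in jobs:
        job.wait()
    stop_mps_daemon()
```

In a setup like this, the two jobs each get roughly half the GPU's compute instead of contending for the whole device, which is the kind of flexible resource allocation the aiWorks update is describing.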