Lenovo introduces entry-level, liquid-cooled AI edge server

Lenovo has announced the ThinkEdge SE100, an entry-level AI inferencing server designed to make edge AI affordable for enterprises as well as small and medium-sized businesses.
AI systems are not normally associated with being small and compact; they are big, decked-out servers with lots of memory, GPUs, and CPUs. But this server is built for inferencing, the less compute-intensive portion of AI processing, Lenovo stated. GPUs are considered overkill for inferencing, and multiple startups are making small PC cards with inferencing chips on them instead of more power-hungry CPUs and GPUs.
This design brings AI to the data rather than the other way around. Instead of sending data to the cloud or a data center for processing, edge computing uses devices located at the data source, reducing latency and the amount of data that has to be sent up to the cloud, Lenovo stated.
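To make the edge inferencing pattern concrete, here is a minimal Python sketch of what software on such a device might do: run a model locally against data captured on site and forward only the compact result upstream. It assumes a quantized ONNX model at "model.onnx" and a hypothetical results endpoint at https://example.com/ingest, both of which are placeholders, not anything specific to the SE100.

```python
# Minimal edge-inferencing sketch: run the model next to the data source
# and ship only the small result upstream, not the raw input.
# Assumptions: "model.onnx" and the ingest URL are illustrative placeholders.

import numpy as np
import onnxruntime as ort
import requests

# Load the model once on the edge device. The CPU execution provider is
# often sufficient for inferencing workloads; no GPU is required.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def infer_and_forward(frame: np.ndarray) -> None:
    # Inference happens at the data source, which keeps latency low.
    outputs = session.run(None, {input_name: frame.astype(np.float32)})
    prediction = int(np.argmax(outputs[0]))

    # Only the compact prediction crosses the network, not the raw frame.
    requests.post("https://example.com/ingest", json={"prediction": prediction})

if __name__ == "__main__":
    # Dummy array standing in for locally captured sensor or camera data.
    infer_and_forward(np.random.rand(1, 3, 224, 224))
```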