Optimizing Server Power Consumption
One of the essential tools for optimizing server power consumption is the power supply derating factor, sometimes called a single-rail derating factor or input limitation. In this article, we will delve into derating factors, aiming to clarify their role in power supply selection and data center cabling design.
First, let us examine the practice itself. In electrical engineering, derating refers to operating a device below its maximum rated output in order to guarantee reliability and prevent premature failure. Server PSU manufacturers often incorporate derating into the design as a failsafe measure, keeping the device within safe temperature ranges, preserving performance, and preventing potential infrastructure failure.
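To make the temperature side of this concrete, the sketch below models a simple linear temperature derating curve of the kind often published in PSU datasheets. The specific thresholds (full output up to 40 °C, falling linearly to 50% of the rating at 70 °C) are illustrative assumptions, not values from any particular product.

```python
def derated_capacity_w(rated_w: float, ambient_c: float,
                       full_rating_max_c: float = 40.0,
                       cutoff_c: float = 70.0,
                       min_fraction: float = 0.5) -> float:
    """Usable output (W) under a hypothetical linear temperature derating curve.

    Below full_rating_max_c the PSU delivers its full rating; between
    full_rating_max_c and cutoff_c the allowed output falls linearly to
    min_fraction of the rating. All thresholds here are illustrative.
    """
    if ambient_c <= full_rating_max_c:
        return rated_w
    if ambient_c >= cutoff_c:
        return rated_w * min_fraction
    span = cutoff_c - full_rating_max_c
    fraction = 1.0 - (1.0 - min_fraction) * (ambient_c - full_rating_max_c) / span
    return rated_w * fraction

# Example: an 800 W PSU at 55 C ambient
print(derated_capacity_w(800.0, 55.0))  # 600.0 W usable
```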
The derating factor is the ratio of the actual usable output to the maximum rated power, usually expressed as a decimal for easy comparison (for example, a factor of 0.8 means the device should be loaded to at most 80% of its rating). Derating can be broken down into three types, combined in the sizing sketch after this list:
- Input line voltage variation derating
- Standard output derating
- Optional internal derating curves
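Putting these together, the sketch below combines per-type derating factors multiplicatively to size the PSU nameplate rating needed for a given load. Treating the factors as independent multipliers is a common simplification, and all factor values here are illustrative assumptions rather than vendor data.

```python
# Hypothetical derating factors (illustrative values, not vendor data)
DERATING_FACTORS = {
    "input_voltage_variation": 0.95,  # margin for low-line input voltage
    "standard_output": 0.90,          # manufacturer's continuous-load limit
    "internal_curve": 0.85,           # optional thermal/altitude curve
}

def required_psu_rating_w(load_w: float, factors: dict[str, float]) -> float:
    """Minimum PSU nameplate rating needed to serve load_w after derating.

    Each factor scales down the usable fraction of the nameplate rating,
    so the required rating is the load divided by their product.
    """
    usable_fraction = 1.0
    for factor in factors.values():
        usable_fraction *= factor
    return load_w / usable_fraction

# Example: a 550 W server load
print(round(required_psu_rating_w(550.0, DERATING_FACTORS), 1))  # ~756.8 W
```

A design note on this simplification: multiplying the factors assumes the worst case for each derating mechanism can occur at the same time, which is conservative but is how capacity planning is often done when the actual correlations are unknown.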