
Home heating, possible use of excess heat from data centers in Europe

By Thomas Grimm

In Europe, concerns about polluting gas emissions add to those caused by the price of the energy needed, among other things, to heat citizens' homes. But several countries on the continent have already started looking for alternatives to existing technologies. For example, according to TechRadar, they are studying how to take advantage of the excess heat from data centers, whose presence in the region is already considerable and still growing, given the construction of new facilities.

These data centers, mostly owned by big tech companies, consume vast amounts of energy to keep their servers and computers cool. The massive amount of heat they produce, a result of storing ever more data and running ever more workloads, is typically dissipated through high-powered air conditioning systems or cooling towers. As a result, all that heat simply goes to waste, something many data centers are already remedying.

Thus, more and more data centers are using this residual heat to warm buildings and homes. In Denmark, for instance, Meta has been recovering excess heat from its Odense data center for some time. It started doing so in 2020 and hopes to heat the equivalent of 11,000 homes with it in 2023. Microsoft, Apple, and Amazon have all begun to draw up plans for similar projects, and Alphabet is also committed to exploring opportunities for using this excess heat for heating.

Meanwhile, ten data centers in the Netherlands are already connected to district heating systems, distributing excess heat to nearby homes and buildings. They will not be the last: another 15 in the country plan to follow in their footsteps.

Using data centers to heat homes has many advantages. To begin with, it reduces demand for fossil fuels, which are widely used for heating in many European regions. It also has the potential to help cut emissions of polluting gases, since in many cases data centers are powered by renewable energy sources such as wind and solar.

In France and Denmark, both municipal and national governments have reportedly approved tax incentives for the smart use of waste heat, and some building permits require the buildings for which they are granted to capture the excess heat they generate.

In addition to heating homes, data centers are also being used to heat greenhouses, which allow farmers to grow crops year-round. According to some experts, a 180-kilowatt data center would be capable of heating up to 5,000 square meters of greenhouse space in winter, enough to produce 250 tons of tomatoes.
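As a rough sanity check on those figures (an illustration only, not part of the source's claim), the implied heating power per square meter of greenhouse works out as follows:

```python
# Back-of-envelope check of the figures quoted above:
# 180 kW of waste heat spread over 5,000 m² of greenhouse space.

datacenter_heat_kw = 180       # waste heat output of the data center
greenhouse_area_m2 = 5_000     # heated greenhouse area

heat_per_m2_w = datacenter_heat_kw * 1_000 / greenhouse_area_m2
print(f"{heat_per_m2_w:.0f} W/m²")  # → 36 W/m²
```

That is a modest but plausible supplemental heating density for a winter greenhouse, which is consistent with the waste heat being a complement rather than a sole heat source.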


NASA and IBM will develop foundational AI models to study the impact of climate change


IBM and NASA, specifically its Marshall Space Flight Center, are going to collaborate on the development of foundational Artificial Intelligence models with which they will discover new elements and concepts in the enormous volume of geospatial and Earth-science data held by NASA. The joint work, which will use IBM's AI, will for the first time apply foundation model technology to NASA's Earth-observing satellite data.

A foundation model is a type of Artificial Intelligence model that is trained on a very large set of unlabeled data. Such models can then be used for many different tasks, transferring what they learn in one situation to another. They are most common in natural language processing (NLP), an area in which they have taken off notably in the last five years. IBM, for its part, is one of the pioneering companies in applying these models to areas other than language.
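As a toy illustration of the idea described above (not IBM's actual method; all data and names here are synthetic assumptions), a representation learned once from plenty of unlabeled data can be reused for a downstream task that has only a handful of labeled examples:

```python
# Toy sketch of the foundation-model pattern: "pretrain" a reusable
# representation on unlabeled data, then solve a downstream task with
# very few labels. Synthetic data throughout; PCA stands in for the
# far larger pretraining done by real foundation models.
import numpy as np

rng = np.random.default_rng(0)

# "Pretraining": learn a 10-dimensional representation of
# 50-dimensional inputs via PCA (SVD) on a large unlabeled sample.
pretrain = rng.normal(size=(1000, 50))
pretrain -= pretrain.mean(axis=0)
_, _, vt = np.linalg.svd(pretrain, full_matrices=False)
encoder = vt[:10].T          # reusable feature extractor

def embed(x):
    return x @ encoder

# "Downstream task": classify with just 5 labeled examples per class
# by comparing distances to class centroids in the learned space.
class_a = rng.normal(loc=0.0, size=(5, 50))
class_b = rng.normal(loc=4.0, size=(5, 50))
centroid_a = embed(class_a).mean(axis=0)
centroid_b = embed(class_b).mean(axis=0)

def classify(x):
    z = embed(x)
    da = np.linalg.norm(z - centroid_a)
    db = np.linalg.norm(z - centroid_b)
    return "a" if da < db else "b"
```

The point of the pattern is that the expensive step, learning `encoder`, happens once, while each new task only needs a few labels, which is why the same geospatial foundation model can serve many Earth-science applications.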

This partnership between IBM and NASA aims to give researchers working on Earth-related topics a simpler way to analyze and extract information from the data available about the planet. With IBM's foundation-model technology they can speed up both analysis and discovery, advancing scientific understanding of the Earth and what happens on it more quickly, which in turn leads to better answers to problems related to climate change.

IBM and NASA plan to develop new technologies to extract information from Earth observations through their joint work. To start, one project will train an IBM geospatial intelligence model on the Harmonized Landsat Sentinel-2 (HLS) dataset, a record of the land surface and land-use changes captured by satellites orbiting the planet.

By analyzing the vast amounts of collected satellite data to identify changes in the geographic footprint caused by events such as natural disasters and shifts in crop yields, this foundation model will help researchers carry out critical analysis of Earth's environmental systems.

The collaboration between the two entities is also expected to create a corpus of Earth-science literature that can be searched easily. IBM has built an NLP model trained on some 300,000 Earth-science journal articles to classify the literature and facilitate knowledge discovery.
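A minimal sketch of the classification task just described (not IBM's pipeline, which uses large transformer models; the labels and abstracts below are invented for illustration) can be expressed as a bag-of-words classifier:

```python
# Minimal sketch of classifying scientific abstracts by topic with a
# bag-of-words overlap score. Real literature-classification systems
# use NLP models trained on hundreds of thousands of papers; the tiny
# labeled set below is purely illustrative.
from collections import Counter

LABELED = [
    ("sea surface temperature anomalies in the pacific", "oceanography"),
    ("ocean salinity and current circulation", "oceanography"),
    ("aerosol concentration in the upper atmosphere", "atmosphere"),
    ("cloud formation and atmospheric convection", "atmosphere"),
]

def features(text):
    return Counter(text.lower().split())

def classify(text):
    # score each label by word overlap with its training abstracts
    query = features(text)
    scores = Counter()
    for abstract, label in LABELED:
        scores[label] += sum((features(abstract) & query).values())
    return scores.most_common(1)[0][0]

print(classify("pacific ocean temperature trends"))  # → oceanography
```

A corpus indexed this way lets researchers route a query to the right slice of the literature before any deeper analysis.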

The model represents one of the largest AI workloads trained on Red Hat OpenShift to date. The fully trained model uses PrimeQA, IBM's open-source multilingual question-answering system. Beyond the study of Earth sciences, and helping to understand and curb the effects of climate change, it could be integrated into NASA's scientific data management processes.

Another project likely to come out of this agreement is the development of a foundation model for weather and climate prediction based on the MERRA-2 dataset. This project is part of NASA's Open Source Initiative.

Raghu Ganti, Principal Investigator at IBM, has pointed out that "foundation models have proven successful in natural language processing, and it is time to expand them into new domains and modalities important to business and society. Applying foundation models to geospatial, event-sequence, time-series, and other non-language factors within Earth science data could make enormously valuable insights and information available to a much broader group of researchers, companies, and citizens. Ultimately, it could help a larger number of people working on some of our most pressing climate issues."

For his part, Rahul Ramachandran, Principal Investigator at NASA's Marshall Space Flight Center, has recalled that "the beauty of foundation models is that they can potentially be used for many downstream applications," while acknowledging that "the construction of these models cannot be tackled by small teams. Teams in different organizations are needed to bring their different perspectives, resources, and skill sets to the table."


OpenAI announces ChatGPT Plus, a paid version of its AI chatbot, for $20 per month


Just a few days ago OpenAI announced that it was preparing a paid version of ChatGPT. It is now ready: ChatGPT Plus. For 20 dollars a month, subscribers will get priority access to the chatbot, even during peak hours of use of this Artificial Intelligence text generation system.

Users who do not pay for ChatGPT Plus will therefore have to wait behind paying users. The plan also promises faster response times, as well as priority access to the new features and improvements that OpenAI releases for the chatbot in the future.

For now, OpenAI will send subscription invitations for ChatGPT Plus over the coming weeks to users in the United States who have signed up to the waiting list for the paid plan. There are as yet no deadlines or dates for expanding the service elsewhere; OpenAI has limited itself to saying that it will open the plan to other countries and regions in the future.

Just a few days ago there was speculation about the price of the monthly ChatGPT Plus subscription. According to various rumors it was going to be much higher, costing up to 42 dollars a month, which would have made it difficult for small companies, research groups, and organizations to afford. At $20 a month, it will be accessible to more people and entities that need reliable access to AI-generated text.

Furthermore, this price and feature set could set a standard for the capabilities and cost of other Artificial Intelligence chatbots appearing on the market. OpenAI, of course, has been quick to make clear that ChatGPT is not going to become a tool without a free tier: its managers have assured that free access to the chatbot will continue, funded in part by subscribers' monthly payments. Meanwhile, development continues, and the company plans to hire new specialized personnel for it.


OpenAI invests in ChatGPT coding and programming with new hires


Despite the massive layoffs taking place in the technology sector, OpenAI is committed to recruiting personnel to create new artificial intelligence models that improve its products, among them ChatGPT. The objective of the company co-founded by Elon Musk is reportedly to incorporate 1,000 new workers in different countries around the world to work on producing improved code in its AI tools.

Of those thousand new employees, about 400 (40%) are programmers whose work will focus on having the AI perform software engineering tasks. The remaining 60% will focus on data labeling, that is, on creating raw data (text, audio, images, and video) that is then labeled to specify the context in which the AI operates. The profile OpenAI currently demands most is reportedly that of Python developer, a role requiring deep command of that programming language.

The report, published by Semafor, indicates that the new hires will have to create training data ranging from lines of code to natural-language explanations of that code. OpenAI already has a tool along these lines, launched in 2021: OpenAI Codex, which was trained on data pulled from GitHub (Microsoft's code repository). Codex is also used by Microsoft to power GitHub Copilot, a service that helps programmers write code.
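The kind of labeled pair the report describes might look like the following hypothetical example (invented for illustration; it is not actual OpenAI training data): a code snippet together with a natural-language explanation written by a human annotator.

```python
# Hypothetical training pair of the kind the article describes:
# a code snippet plus a human-written natural-language explanation.
# Models trained on many such pairs learn to map between code and
# its description in either direction.
training_example = {
    "code": "def fahrenheit_to_celsius(f):\n    return (f - 32) * 5 / 9",
    "explanation": (
        "Defines a function that converts a temperature from degrees "
        "Fahrenheit to degrees Celsius by subtracting 32 and scaling "
        "by five ninths."
    ),
}

# The annotated code is itself runnable, which lets annotators verify
# that the explanation matches real behavior.
exec(training_example["code"])
print(fahrenheit_to_celsius(212))  # → 100.0
```

Annotators producing thousands of such verified pairs is what the article means by creating "raw data that will then be labeled."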

OpenAI Codex masters more than a dozen programming languages and is capable of interpreting simple natural-language commands and creating an interface for existing applications. In any case, better coding could improve ChatGPT's ability to converse with humans and help it overcome criticism such as that of the Stack Overflow website, which accuses the OpenAI tool of not offering reliable answers.

The key to improving a coding assistant will be building a system capable of anticipating the effect of its own actions. Meta's chief artificial intelligence scientist, Yann LeCun, has drawn a parallel between cruise-control systems in cars and coding assistants like Copilot: without constant human supervision, Copilot will end up introducing code errors. The problem arises when dealing with a code model whose output is sometimes executed and sometimes not.

The OpenAI Selection Process

To select qualified personnel for programming and supervising ChatGPT, OpenAI put candidates through a five-hour unpaid coding test comprising various tasks. One of them was to explain in writing how they would solve a coding problem and, if they detected errors in the process, describe what they were and how they should be corrected. The aim, ultimately, is to feed the tool very specific training data annotated by humans.

The company's objective is for these workers to make the AI capable of writing code more precisely, potentially replacing entry-level programmers in the future. Only then will it be able to strengthen its AI tools, such as ChatGPT, adapting more quickly and intelligently to change and standing up to its main competitors in the sector.
