CoinSpot is an Australian cryptocurrency exchange that claims to offer the largest selection of cryptocurrencies of any Australian exchange. That may be true, but is it actually a good exchange? That’s the question this CoinSpot review seeks to answer, and the answer is that CoinSpot is only good if you stick to its “Markets” screen! The buy, swap and exchange features are grossly expensive, and you’ll be paying through the nose in fees. So before you use CoinSpot, you’d better be aware of the fees you will be charged if you use the wrong screen!
A group of chemists from the University of Leiden has created a new membrane which, according to the press release accompanying the study published in Nature Nanotechnology, “can produce one hundred times more energy from sea water”. The new membrane is ultra-thin: just one molecule thick.
Once immersed in water, the membrane can harvest the energy of particles moving from one side to the other, something that happens very readily when salt water comes into contact with fresh water, because salt is exchanged between the two. This is the same kind of process that is usually used to desalinate sea water.
The new membrane developed by the chemists in Leiden, however, is much more efficient because it produces 100 times more energy than “classic” membranes.
The membrane is made of carbon and is porous and thin at the same time, unlike the more common membranes which are either porous or thin.
The membrane was created by spreading oily molecules on the surface of the water. These molecules formed a film which, once heated, turned into a stable porous membrane.
“The membrane we created is only two nanometres thick and is permeable to potassium ions. We can modify the properties of the membrane using a different molecular block. In this way we can adapt it to meet any requirement,” says Xue Liu, one of the authors of the study and the creators of the membrane together with his colleague Grégory Schneider.
The new membrane is similar to graphene but at the same time different from this material, as Schneider himself explains: “When making a membrane, many researchers start with graphene, which is very thin but not porous. Then they try to drill holes to make it more permeable. We did the opposite by assembling small molecules and building a larger porous membrane from those molecules. Compared to graphene, it contains imperfections, but that’s what gives it its special properties.”
For plants, sunlight means life, but it can also be a double-edged sword. Although they need light to perform photosynthesis, too much sunlight, or light that is too intense, can irreparably damage the leaves and thus the plant.
This is precisely why plants have developed a strategy to defend themselves against too much sunlight: to dissipate this light as heat.
A new study, conducted by Gabriela Schlau-Cohen, Professor of Chemistry at MIT, and her colleagues in collaboration with other Italian institutes, shows how this delicate process takes place.
Through very sensitive spectroscopic analyses, the researchers discovered that the excess energy plants receive from the Sun’s rays is transferred from chlorophyll to other pigments called carotenoids.
In this way the energy absorbed from the Sun can be released to the outside world as heat, preventing light-induced damage to the plant’s cells.
“This is the first direct observation of the transfer of energy from chlorophyll to carotenoid in the light-gathering complex of green plants,” says Schlau-Cohen. “This is the simplest proposal, but no one had been able to find this photophysical pathway until now.”
The researchers also found that environmental conditions can influence the rate of energy dissipation.
This study, and a further improved understanding of plants’ natural “photoprotection” system, could help in devising new methods to maximize yields: as Schlau-Cohen herself says, crop yields could be increased by 15-20%.
A team of German researchers identified the Borna virus (family Bornaviridae), a virus transmitted by shrews, in eight patients hospitalized between 1999 and 2019 for encephalitis. As reported in the study in The Lancet Infectious Diseases, these cases were all recorded in southern Germany, mainly among people living in rural areas.
The virus is transmitted by the shrew (family Soricidae), a small mouse-like mammal found across much of the globe and considered one of the smallest mammals in existence. According to the researchers, these animals transmitted the virus to domestic cats, which in turn transmitted it to their owners or other humans.
Symptoms of the infection include headaches, confusion, fever, convulsions and memory loss as well as, in the most serious cases, loss of consciousness. All eight patients examined by the researchers died between 16 and 57 days after admission.
This is precisely why the researchers believe that Borna virus infection should be considered a serious and potentially lethal disease for humans, as Barbara Schmidt of the University of Regensburg reports. Moreover, for reasons that remain unknown, infections in humans seem to have gone completely unnoticed until now.
Although rare, this virus could still be the cause of unexplained cases of serious encephalitis, as Martin Beer of the Friedrich-Loeffler Institute, another author of the study, points out.
A new study of the speed at which the universe is expanding seems to resolve, at least in part, the discrepancies that physicists and cosmologists have run into when trying to measure it. The new study, published in Physics Letters B, does so without resorting to any “new physics”.
Currently there are two methods used to measure this speed. The first is based on the cosmic microwave background and, in particular, the data provided by the Planck space mission. This method yields a value for the so-called “Hubble constant” of 67.4 (km/s)/Mpc. That is, the expansion speed increases by 67.4 km/s for every 3.26 million light-years (one megaparsec) of distance.
The second method is based on supernovae that appear sporadically in distant galaxies. Measuring these bright events gives a Hubble constant of about 74 (km/s)/Mpc.
Lucas Lombriser, researcher at UNIGE’s Faculty of Science, says: “These two values have continued to become more precise for many years while remaining different from each other. It didn’t take much to trigger a scientific controversy and even raise the exciting hope that perhaps we were facing a ‘new physics’.”
According to Lombriser, these differences may stem from the fact that the universe is not as homogeneous as has always been claimed. It has always been difficult to imagine, for example, fluctuations in the average density of matter over volumes thousands of times larger than a galaxy.
This is precisely why Lombriser, in his new study, theorized the existence of a gigantic bubble, 250 million light years in diameter, in which our galaxy is also present and in which the density of matter is significantly lower than the density known for the entire universe.
Such a bubble would affect the calculation of the Hubble constant, because it would contain the very galaxies that are used as references when measuring distances.
By positing that this huge bubble exists and that the density of matter inside it is 50% lower than that of the rest of the universe, we obtain a value for the Hubble constant that converges with the one given by the first method, the cosmic microwave background: “The probability that there is such a fluctuation on this scale ranges from 1 in 20 to 1 in 5, which means that this is not the imagination of a theorist. There are many regions like ours in the vast universe.”
Tardigrades, microscopic animals known to be among the most resilient living species, seem to have a weakness according to a new study published in Scientific Reports and carried out by researchers at the University of Copenhagen.
This resilience is partly due to a characteristic that lets them enter a sort of “suspended animation” during which the body dries out. It is during this phase that they can survive practically any environment on Earth, from oxygen-free habitats to icy temperatures, and even the vacuum and cosmic radiation of space, without suffering any damage.
There are currently about 1,300 known species, most of them between 0.3 and 0.5 mm long. They can be found in every environment, from damp habitats to fresh water, from the equator to the poles. The new study focused on one species of tardigrade, Ramazzottius varieornatus, which seems to have a weakness. After collecting several samples of this species from a site in Denmark, the biologist Ricardo Neves of the University of Copenhagen exposed them to high temperatures and found that, especially among tardigrades that had not had time to acclimatize to the change in temperature, there was a mortality rate of 50% after only 24 hours at a temperature of just 37 °C.
Mortality was reduced if the tardigrades were given a short acclimatisation period of a couple of hours at 30 °C and another couple of hours at 35 °C. This applies to tardigrades not in “hibernation”: those in the “dried” phase showed greater resistance, with the 50% mortality rate reached after 24 hours only at a temperature of 63.1 °C. As the temperature increased further, the microorganisms began to die faster and faster.
According to Neves, this research shows that tardigrades in the active phase are vulnerable to high temperatures, while those in the “dried” phase are hardier and can withstand much higher temperatures, almost twice as high, but not the levels theorized by earlier studies, according to which tardigrades can tolerate temperatures up to 151 °C. In any case, much also depends on the acclimatization period these animals seem to need where temperature is concerned.
An interesting study carried out by researchers at the University of Southampton confirms the existence of gas hydrate deposits large enough to bridge the gap between fossil fuels and renewable sources, until only the latter are used. Gas hydrate, also known as “burning ice,” is a gas usually stored in large quantities in a solid, ice-like form. It consists of water and natural gas (often methane) and is usually found under the sea bed or near the coast.
Recent research had already shown that this gas could play a role in coal replacement in the coming decades, at least until the level of renewable energy is sufficient overall. This study represents a sort of “inventory” of gas hydrate deposits and was created in the context of the European Commission funded project called MIGRATE (Marine Gas Hydrates: An Indigenous Resource of Natural Gas for Europe).
The researchers have identified several sites with direct or indirect indications of the presence of gas hydrate. These sites are located on the west and east coasts of Greenland, in the Arctic archipelago of Svalbard, off the coasts of Norway and western Ireland, and in some limited areas of the Mediterranean Sea, the Sea of Marmara and the Black Sea.
“We have found that gas hydrates are particularly widespread around Svalbard, off Norway and in the Black Sea, but the hydrate systems have only been well analyzed in some areas, so there may still be a lot to discover,” says Tim Minshull, a researcher at the University of Southampton who led the study team.
In an article published in the Astrophysical Journal Supplement Series, the discovery of several interesting exoplanets is announced: one cold Neptune and two “potentially habitable” super-Earths. The two potentially habitable planets orbit GJ180 (Gliese 180) and GJ229A (Gliese 229), two red dwarfs relatively close to the Sun: the first is 40 light-years away, while the second is 19 light-years away and also has a smaller companion, a brown dwarf (GJ229B).
Both planets (GJ180d and GJ229Ac) are super-Earths, i.e. planets with a larger mass and size than the Earth. The exoplanet orbiting GJ180 has a mass 7.5 times that of the Earth, while the one orbiting GJ229A has a mass 7.9 times greater. Their orbital periods are 106 and 122 days respectively.
The cold Neptune (GJ433d), instead, orbits GJ433 (Gliese 433), another red dwarf, located 29.5 light-years from us. All the planets were discovered using the radial velocity method, which detects the tiny gravitational tug the planet exerts on its star: this tug creates small wobbles that can be detected with advanced instruments from Earth.
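To illustrate how small these wobbles are, the standard radial-velocity semi-amplitude formula can be evaluated for a planet like GJ180d. The stellar mass below is an assumed typical red-dwarf value, not a figure from the study, and the orbit is taken as circular and edge-on for simplicity:

```python
import math

# Illustrative estimate of the stellar "wobble" (radial velocity
# semi-amplitude) induced by a GJ180d-like planet.
# Assumptions (not from the study): star mass ~0.4 solar masses,
# circular orbit (e = 0), edge-on orbit (sin i = 1).
G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_SUN   = 1.989e30    # kg
M_EARTH = 5.972e24    # kg

M_star = 0.4 * M_SUN      # assumed red-dwarf mass
m_p    = 7.5 * M_EARTH    # planet mass reported in the article
P      = 106 * 86400      # orbital period of 106 days, in seconds

# K = (2*pi*G/P)^(1/3) * m_p / (M_star + m_p)^(2/3)
K = (2 * math.pi * G / P) ** (1 / 3) * m_p / (M_star + m_p) ** (2 / 3)
print(f"{K:.2f} m/s")     # on the order of a couple of metres per second
```

A wobble of a few metres per second, against a star moving through space at tens of kilometres per second, shows why this method demands extremely precise spectrographs.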
Red dwarfs are among the most common stars in our galaxy and, although they are dimmer and smaller than our Sun, they can still host planets orbiting in the so-called “habitable zone”, the orbital region within which a planet can have liquid water on its surface thanks to a moderate temperature. Usually, planets orbiting red dwarfs always show the same face to the star, something that splits the planet in two: a hot side and a very cold side, which of course is not favorable for life (although, according to various theories, it does not completely exclude it).
GJ180d, instead, orbits its star much as the Earth orbits the Sun, not always showing the same face to it, which raises the probability of habitability and therefore of the possible presence of life. GJ229Ac is interesting because, besides orbiting the red dwarf, it also orbits the latter’s companion, a brown dwarf. It is one of the first exoplanets ever identified orbiting a brown dwarf, which may help untangle doubts about whether such a star can also host planets in non-binary systems.
The ultimate goal, as Jeff Crane, one of the authors of the study, adds, is to build a detailed map of all the planets orbiting the stars closest to the Sun, so that future space telescopes can properly analyze them, especially the potentially habitable ones.
Ocean heat in 2019 was record-breaking, according to a new study in Advances in Atmospheric Sciences. In 2019 the oceans were warmer than in any other measurement made in human history, and this applies in particular to the layer from the surface down to a depth of 2,000 meters.
The study, produced by an international team, shows that the global temperature of the oceans is not simply increasing but actually accelerating. The temperature of the oceans during 2019 was 0.075 °C above the 1981-2010 average.
In 2019 alone the world’s oceans absorbed about 228 sextillion joules of heat, an amount Lijing Cheng, the study’s lead author, compares to the energy of 3.6 billion Hiroshima atomic bomb explosions to help us grasp the scale of the warming.
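The two figures in the comparison are consistent, as a quick back-of-the-envelope check shows. The Hiroshima yield and TNT energy equivalent below are commonly cited reference values, assumed here rather than taken from the study:

```python
# Sanity check of the Hiroshima-bomb comparison quoted above.
# Assumed reference values: Hiroshima yield ~15 kilotons of TNT,
# 1 ton of TNT = 4.184e9 joules.
ocean_heat_2019_j = 228e21           # 228 sextillion joules (from the study)
hiroshima_j = 15_000 * 4.184e9       # ~6.3e13 J per bomb

bombs = ocean_heat_2019_j / hiroshima_j
print(f"{bombs:.2e}")                # roughly 3.6 billion bombs
```

Dividing the study’s 228 sextillion joules by the energy of one bomb does indeed land on about 3.6 billion.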
“This measured ocean warming is irrefutable and is further evidence of global warming. There are no reasonable alternatives to human emissions of heat-trapping gases to explain this warming,” says the researcher, an associate professor at the Institute of Atmospheric Physics of the Chinese Academy of Sciences. The researchers analyzed heat trends in the global oceans since the 1950s, including data recorded by the US National Oceanic and Atmospheric Administration (NOAA).
According to John Abraham, Professor of Mechanical Engineering at the University of St. Thomas in the United States and another author of the study, the oceans are where the vast majority of global warming’s heat ends up: “If you want to understand global warming, you have to measure the warming of the oceans.”
The effects of ocean warming at these levels are already being felt through extreme weather that goes beyond sea level rise itself, manifesting in hurricanes and storms that can in turn cause very serious damage, including economic damage. And according to Abraham, these effects are only the “tip of the iceberg,” because the situation is getting worse.
More than 5,000 wild camels have reportedly been killed, mainly by marksmen firing from helicopters, in fire-ravaged South Australia, according to an AFP report. These wild herds of non-native animals are devastating indigenous communities already hit by drought and by the huge fires that have been spreading in recent weeks.
Aboriginal people are said to suffer most from these wandering herds of camels which, we recall, are not native to Australia but were imported by humans over the past centuries. Because of the extreme heat, worsened by the fires, these animals are migrating from one area to another in search of water, damaging water infrastructure and also proving a real danger to vehicle drivers.
Despite concerns raised by animal rights activists, sniper teams have continued to shoot these animals, which are now considered pests introduced into an environment that is not theirs. In addition to consuming huge amounts of water that is precious to the locals, especially the native groups, these camels often die right next to the water sources.
Their rotting carcasses irreparably pollute the water sources themselves, making them unusable not only by humans but also by other animals, especially birds. And the prolonged drought is making the situation so much worse that mass culling has proved to be the only solution, however brutal it may seem.
Camels were introduced to Australia around 1840, as they were considered very useful for exploring the vast inland areas of the continent. Camels adapt very well to the heat and the very dry regions of the Australian outback. Over the following decades, tens of thousands of camels were imported; they reproduced naturally and over time multiplied essentially unchecked.