Wreck of Russian warship found...believed to hold gold worth $130 billion

  • A South Korean salvage team has reportedly discovered the wreck of a Russian warship
  • It's believed to still contain 200 tons of gold bullion worth 150 trillion won ($130 billion)
  • The Russian Imperial Navy cruiser Dmitrii Donskoi was sunk 113 years ago


A South Korean salvage team has reportedly discovered the wreck of a Russian warship that is believed to still contain 200 tons of gold bullion and coins worth 150 trillion won ($130 billion).

The Russian Imperial Navy cruiser Dmitrii Donskoi, which was sunk in a naval battle 113 years ago, was discovered at a depth of more than 1,400 feet about one mile off the South Korean island of Ulleungdo, according to The Daily Telegraph.

The U.K. newspaper reported that a joint team of experts from South Korea, Britain and Canada discovered the wreck on Sunday, using two manned submersibles to capture footage of the vessel.

The images caught by the submersibles show “extensive damage to the vessel caused in an encounter with Japanese warships in May 1905, along with cannons and deck guns encrusted with marine growth, the anchor and the ship’s wheel,” the Telegraph reported.

There are reports that the Dmitrii Donskoi, which was scuttled during the Russo-Japanese War in 1905 to keep the Japanese from seizing it, went down with 5,500 boxes of gold bars and coins still in its holds.

The Seoul-based Shinil Group, which led the exploration that found the ship, hopes to raise it in October or November. It estimates the gold would have a value today of around $130 billion.

Half of any treasure found aboard the vessel would be handed over to the Russian government, the company said, while 10 percent of the remainder would be invested in tourism projects on Ulleungdo Island, including a museum dedicated to the vessel, The Daily Telegraph reported.

Emerging Stem Cell Ethics




Douglas Sipp, Megan Munsie, Jeremy Sugarman

It has been 20 years since the first derivation of human embryonic stem cells. That milestone marked the start of a scientific and public fascination with stem cells, not just for their biological properties but also for their potentially transformative medical uses. The next two decades of stem cell research animated an array of bioethical debates, from the destruction of embryos to derive stem cells to the creation of human-animal hybrids. Ethical tensions related to stem cell clinical translation and regulatory policy are now center stage and a topic of global discussion this week at the International Society for Stem Cell Research (ISSCR) annual meeting in Melbourne, Australia. Care must be taken to ensure that entry of stem cell–based products into the medical marketplace does not come at too high a human or monetary price.

Despite great strides in understanding stem cell biology, very few stem cell–based therapeutics are as yet used in standard clinical practice. Some countries have responded to patient demand and the imperatives of economic competition by promulgating policies to hasten market entry of stem cell–based treatments. Japan, for example, created a conditional approvals scheme for regenerative medicine products and has already put one stem cell treatment on the market based on preliminary evidence of efficacy. Italy provisionally approved a stem cell product under an existing European Union early access program. And last year, the United States introduced an expedited review program to smooth the path for investigational stem cell–based applications, at least 16 of which have been granted already. However, early and perhaps premature access to experimental interventions has uncertain consequences for patients and health systems.

A staggering amount of public money has been spent on stem cell research globally. Those seeking to develop stem cell products may now not only leverage the resulting body of scientific knowledge but also find that their costs for clinical testing are markedly reduced by deregulation. How should this influence affordability and access? The state's and the taxpaying public's interests should arguably be reflected in the pricing of stem cell products that were developed through publicly funded research and regulatory subsidies. Detailed programs for recouping taxpayers' investments in stem cell research and development must be established.

Rushing new commercial stem cell products into the market also entails considerations inherent to the ethics of using pharmaceuticals and medical devices. For example, once a product is approved for a given indication, it becomes possible for physicians to prescribe it for “off-label use.” We have already witnessed the untoward effects of the elevated expectations that stem cells can serve as a kind of cellular panacea, a misconception that underlies the direct-to-consumer marketing of unproven uses of stem cells. Once off-label use of approved products becomes an option, there may be a new flood of untested therapeutic claims with which to contend. The ISSCR and the United States Federation of State Medical Boards have both recently issued guidelines on clinical translation and use, but adoption and enforcement remain key issues.

The new frontiers of stem cell–based medicine also raise questions about the use of fast-tracked products. In countries where healthcare is not considered a public good, who should pay for post-market efficacy testing? Patients already bear a substantial burden of risk when they volunteer for experimental interventions. Frameworks that ask them to pay to participate in medical research warrant much closer scrutiny than has been seen thus far.

Striking the proper balance between streamlining review processes and ensuring that there is sufficient evidence before bringing products into clinical use is a perennial predicament for patients, payers, scientists, clinicians, and regulators. For stem cell treatments, attaining this balance will require frank and open discussion among all stakeholders, including the patients these treatments seek to benefit and the taxpayers who make them possible.

George Orwell would love this...

Genealogy databases and the future of criminal investigation



The 24 April 2018 arrest of Joseph James DeAngelo as the alleged Golden State Killer, suspected of more than a dozen murders and 50 rapes in California, has raised serious societal questions related to personal privacy. The break in the case came when investigators compared DNA recovered from victims and crime scenes to other DNA profiles searchable in a free genealogical database called GEDmatch. This presents a different situation from the analysis of DNA of individuals arrested or convicted of certain crimes, which has been collected in the U.S. National DNA Index System (NDIS) for forensic purposes since 1989. The search of a nonforensic database for law enforcement purposes has caught public attention, with many wondering how common such searches are, whether they are legal, and what consumers can do to protect themselves and their families from prying police eyes. Investigators are already rushing to make similar searches of GEDmatch in other cases, making ethical and legal inquiry into such use urgent.

In the United States, every state, as well as the federal government, has enacted laws enumerating which convicted or arrested persons are subject to compulsory DNA sampling and inclusion in the NDIS database. The NDIS contains more than 12 million profiles, and it is regularly used to match DNA from crime scenes to identify potential suspects. It is only helpful, however, if the suspect—or a family member of the suspect—has been arrested for or convicted of a qualifying crime and their DNA has been collected and stored.

The case of the Golden State Killer is not the first instance of investigators turning to nonforensic DNA databases to generate leads. This was not even the first time investigators used genealogical DNA matches to develop and pursue a suspect in the Golden State Killer case itself. A year before investigators zeroed in on DeAngelo, they subpoenaed another genetic testing company for the name and payment information of one of its users and obtained a warrant for the man's DNA. He was not a match. Similarly, in 2014, Michael Usry found himself the target of a police investigation stemming from a partial genetic match between his father's DNA, stored in an Ancestry.com database, and DNA left at a 1996 murder scene. On the basis of the partial match, police were able to obtain a court order requiring Ancestry.com to disclose the identity of the database DNA match. After mapping out several generations of Usry's father's family, investigators zeroed in on Usry, eventually securing a warrant for his DNA. Ultimately, Usry was cleared as a suspect when his DNA proved not to match the crime scene DNA.

But there have also been reported successes. In 2015, for example, Arizona police arrested and charged Bryan Patrick Miller in the Canal Killer murders based in part on a tip drawn from a genealogical database search (1). Searches like these, drawing on genetic information unrelated to the criminal justice system, may offer substantial benefits. Allowing police to conduct similar database searches in other cases is likely to lead to more solved crimes. Moreover, expanding law enforcement investigations to encompass genealogical databases may help to remedy the racial and ethnic disparities that plague traditional forensic searches. In accordance with state laws, official forensic databases are typically limited to individuals arrested or convicted of certain crimes. Racial and ethnic disparities throughout the criminal justice system are therefore reproduced in the racial and ethnic makeup of these forensic databases. Genealogical databases, by contrast, are biased toward different demographics. The 23andMe database, for instance, consists disproportionately of individuals of European descent. Including genealogical databases in forensic searches might thus begin to redress, in at least one respect, disparities in the criminal justice system.

There are few legal roadblocks to police use of genetic databases intended to help individuals explore their health or identify genetic relatives. The Fourth Amendment's protection against warrantless searches and seizures generally does not apply to material or data voluntarily shared with a third party, like a direct-to-consumer genetics testing or interpretation company or a genetic matching platform like GEDmatch. Once an individual has voluntarily shared her data with a third party, she typically cannot claim any expectation of privacy in those data—and so the government need not secure a warrant before searching it.

Beyond the Constitution, three federal laws protect some genetic data against certain disclosures, but these too are unlikely to provide an effective shield against law enforcement searches in nonforensic genetic databases. The Genetic Information Nondiscrimination Act (GINA) protects genetic data, but only against certain uses by employers and health insurers. GINA provides no protection against law enforcement searches. Similarly, most companies and websites offering DNA testing, interpretation, or matching services directly to individuals likely are not covered by the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, which governs the use and disclosure of identifiable health information. These providers are usually careful to explain that they are not engaged in health care or the manipulation or provision of health data. Finally, although certificates of confidentiality protect scientific researchers from disclosing data to law enforcement—even against a warrant—they do not extend to scenarios in which law enforcement is just another contributor to and user of online genetic resources, such as public databases and matching tools. Certificates of confidentiality have faced few challenges in court, and so it is also uncertain whether the protection they purport to provide would hold up against a challenge by law enforcement seeking access.

Consistent with this legal landscape, companies and websites that generate, interpret, or match genetic data directly for individuals often do not promise complete protection. In terms of law enforcement, for instance, 23andMe states in its privacy policy, “23andMe will preserve and disclose any and all information to law enforcement agencies or others if required to do so by law or in the good faith belief that such preservation or disclosure is reasonably necessary to…comply with legal or regulatory process (such as a judicial proceeding, court order, or government inquiry)…”. Ancestry.com similarly discloses, “We may share your Personal Information if we believe it is reasonably necessary to: [c]omply with valid legal process (e.g., subpoenas, warrants)…”. And in the wake of the Golden State Killer arrest, GEDmatch altered its terms of service to explicitly permit law enforcement use of its database to investigate homicides and sexual assaults. Although these disclaimers are usually unambiguous, they are sometimes buried in terms of service or privacy policies that many individuals do not take care to read or fully understand.

Despite the lack of legal protection against law enforcement searches of nonforensic databases, such searches may run counter to core values of American law. The Fourth Amendment is a constitutional commitment to protect fundamental civil rights. Part of that is a commitment to protecting privacy or freedom from government surveillance. Police cannot search a house without suspecting a specific individual of particular acts—even if doing so would enable the police to solve many more crimes. Yet, database searches permit law enforcement to search the genetic data of each database member without any suspicion that a particular member is tied to a particular crime. Although the U.S. Supreme Court has approved suspicionless genetic searches for individuals with diminished expectations of privacy, like those arrested or convicted of crimes, ordinary members of the public are different. Familial searches, like those used in the Golden State Killer investigation, are an even further departure from the Supreme Court standard. Certainly, individuals who commit crimes and leave their DNA behind forfeit any expectation of privacy in that DNA. But a usable forensic identification requires two matching parts: a crime scene sample and a database profile that matches it. Suspects identified through familial searches cannot be said to have voluntarily shared their genetic profile in a database of known individuals, even if a genetic relative has.

The Supreme Court is poised to reconsider its broad rule that the voluntary sharing of data negates expectations of privacy—and thus negates Fourth Amendment protections against warrantless government searches. In Carpenter v. United States, the Supreme Court will determine whether police must obtain a warrant to justify access to historical cell phone records revealing the movements and location of a cell phone user over a long period of time (9). In the digital age, in which nearly all data are at least nominally shared with third parties like internet service providers, website hosts, and cell phone companies, the current rule means that the Fourth Amendment often does not apply. Carpenter may reshape this rule to account for the realities of a big-data world. A ruling in Carpenter that limits police use of historical cell phone data may substantially affect police practices surrounding genetic data as well, as merely sharing data with another might well be insufficient to permit its suspicionless search by the government for crime-detection purposes.

Even if the Supreme Court decision in Carpenter does not revamp Fourth Amendment rules governing police access to shared data, the setting of that case suggests another way to resolve concerns about police access to nonforensic genetic databases. In the Stored Communications Act, Congress provided substantial statutory protection for email and other digital information maintained on the internet. Under the act, a court may order disclosure of electronic records if the government “offers specific and articulable facts showing that there are reasonable grounds to believe” that the records sought “are relevant and material to an ongoing criminal investigation” (10). This standard is less onerous than the Fourth Amendment's warrant requirement, but it is notably more demanding than any protections the law currently provides.

Enacting similar protection for genetic data stored in nonforensic databases would ensure that the government cannot subject ordinary individuals to suspicionless genetic searches, while allowing investigators to access genetic data where there is reason to believe a particular individual may be tied to a particular crime. A Stored Genetics Act would likely render law enforcement searches of nonforensic genetic databases unlawful for crime-detection purposes, as there can be no “specific and articulable” connection between particular database records and a particular crime when investigators seek to use such a search to generate leads, not investigate them. Thus, although such an approach would preserve freedom from perpetual genetic surveillance by the government, it may well result in fewer solved cases.

Legislatures may understandably be loath to enact a total prohibition of such searches. At a minimum, however, policy-makers should delineate under what circumstances such searches are acceptable. For example, several states, including California, Colorado, and Texas, have identified prerequisites to the use of familial searches of the state's own forensic database, including that the crime to be investigated is serious and that traditional investigative techniques have been exhausted without success. Similar constraints could be placed on law enforcement searches of nonforensic databases. The challenge of this approach is that limitations on the scope of use can erode quickly. Thus, although Colorado's policy governing familial searches of the state's forensic database limits such searches to crimes with “significant public safety concerns,” police in that state used a familial search to solve a car break-in where the perpetrator “left a drop of blood on a passenger seat when he broke a car window and stole $1.40 in change”. The erosion of limits on crime-solving technology may well be inevitable, and it threatens our collective civil liberties and opens the door to socially and politically unacceptable genetic surveillance.

Whatever legislative solution is adopted, it must at least take into account public perspectives to clearly delineate acceptable uses and balance the social benefit of solving cases with individuals' interests in avoiding unwarranted government scrutiny.

Golf Icon


In honor of the U.S. Open this week, I have decided to honor a true legend that most people, even golf fans, have never heard of.

You've probably never heard of Mike Austin. He was a professional golfer whose career spanned half a century, from the 1930s through the 1980s. It is both amazing and sad that this phenomenal man has fallen into obscurity. His career included stints as an intelligence officer and pilot in World War II, acting in movies, and teaching golf to celebrities like Howard Hughes and Jack LaLanne; he even had his own TV show.

Austin was known for his exceptionally long drives. So much so that he hated the PGA, because he found out that other golfers were bribing tournament officials not to be paired with him; they were afraid of being embarrassed. With his 4-iron he could hit the ball farther than other players could hit their drivers. Austin also said officials would always send him out in the first group of the day, in the morning when the fairways were still wet, which kept the ball from rolling far in the grass and cost him distance.

In 1974, at the U.S. National Seniors Tournament at the Winterwood Golf Course in Nevada, Austin hit the longest recorded drive by a professional in PGA history. On a 450-yard par-4, he drove the green and the ball kept rolling; when it stopped, it was an astonishing 515 yards from the tee. Unreal. (He then hit a wedge onto the green and three-putted for bogey.)

Of course, some things helped him. There was a 20 mph tailwind, and balls tend to roll far on dry desert fairways. But this was also 1974, with that era's technology; with modern clubs and balls, it is very possible he could have hit it even farther. The long hitters on today's tour, like Doosh-ba Watson (more on him later in the week) and Dustin Johnson, drive the ball about 320 yards...nearly two full football fields shorter.

Mike Austin could swing his driver at an incredible 155 mph! To put this into perspective, the best players you see on TV swing their drivers at around 115 or 120 mph. Even in long-drive contests, where the competitors are 6'8" and built like linebackers, the winners are 'only' swinging around 135 mph. Austin said he learned to develop this incredible speed while earning his Ph.D. in kinesiology, studying the mechanics of the golf swing. The guy was a true badass.

Check this out...it's from the TV show he used to have.




Gerolamo Cardano


Gerolamo (or Girolamo, or Geronimo) Cardano (24 September 1501 – 21 September 1576) was an Italian polymath whose interests and proficiencies spanned mathematics, medicine, biology, physics, chemistry, astrology, astronomy, philosophy, writing, and gambling. He was one of the most influential mathematicians of the Renaissance, one of the key figures in the foundation of probability, and among the first in the Western world to introduce the binomial coefficients and the binomial theorem. He wrote more than 200 works on science.

Cardano wanted to practice medicine in a large, rich city like Milan, but he was denied a license to practice, so he settled for the town of Saccolongo, where he practiced without a license. There, he married Lucia Banderini in 1531. Before her death in 1546, they had three children, Giovanni Battista (1534), Chiara (1537) and Aldo (1543).[6] Cardano later wrote that those were the happiest days of his life.

With the help of a few noblemen, Cardano obtained a teaching position in mathematics in Milan. Having finally received his medical license, he practiced mathematics and medicine simultaneously, treating a few influential patients in the process. Because of this, he became one of the most sought-after doctors in Milan. In fact, by 1536, he was able to quit his teaching position, although he was still interested in mathematics. His notability in the medical field was such that the aristocracy tried to lure him out of Milan. Cardano later wrote that he turned down offers from the kings of Denmark and France, and the Queen of Scotland.

Cardano was the first mathematician to make systematic use of numbers less than zero. He published, with attribution, the solution of Scipione del Ferro to the cubic equation and the solution of his student Lodovico Ferrari to the quartic equation in his 1545 book Ars Magna. The solution to one particular case of the cubic equation, ax³ + bx + c = 0 (in modern notation), had been communicated to him in 1539 by Niccolò Fontana Tartaglia (who later claimed that Cardano had sworn not to reveal it, and engaged Cardano in a decade-long dispute) in the form of a poem, but Ferro's solution predated Fontana's. In his exposition, he acknowledged the existence of what are now called imaginary numbers, although he did not understand their properties, which were described for the first time by his Italian contemporary Rafael Bombelli. In Opus novum de proportionibus he introduced the binomial coefficients and the binomial theorem.
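As a quick illustration (a modern sketch, not Cardano's own notation), the real-root formula for the depressed cubic t³ + pt + q = 0 published in Ars Magna can be checked numerically. The function name and the case handling here are my own; the example t³ + 6t = 20 is the classic worked case associated with the Cardano–Tartaglia solution.

```python
import math

def cardano_real_root(p, q):
    """Real root of t^3 + p*t + q = 0 via Cardano's formula:
    t = cbrt(-q/2 + sqrt(q^2/4 + p^3/27)) + cbrt(-q/2 - sqrt(q^2/4 + p^3/27)).
    Sketch covering only the single-real-root case."""
    d = (q / 2) ** 2 + (p / 3) ** 3  # discriminant term under the square root
    if d < 0:
        # Three real roots ("casus irreducibilis") need complex cube roots.
        raise ValueError("three real roots: requires complex arithmetic")
    s = math.sqrt(d)
    # Real cube roots, preserving sign (Python's ** on negatives would go complex).
    u = math.copysign(abs(-q / 2 + s) ** (1 / 3), -q / 2 + s)
    v = math.copysign(abs(-q / 2 - s) ** (1 / 3), -q / 2 - s)
    return u + v

# t^3 + 6t - 20 = 0, i.e. t^3 + 6t = 20, has the root t = 2
root = cardano_real_root(6, -20)
```

Plugging the result back in, root³ + 6·root should come out to 20 up to floating-point error.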

Cardano was notoriously short of money and kept himself solvent by being an accomplished gambler and chess player. His book about games of chance, Liber de ludo aleae ("Book on Games of Chance"), written around 1564 but not published until 1663, contains the first systematic treatment of probability, as well as a section on effective cheating methods. He used the game of throwing dice to understand the basic concepts of probability. He demonstrated the efficacy of defining odds as the ratio of favourable to unfavourable outcomes (which implies that the probability of an event is given by the ratio of favourable outcomes to the total number of possible outcomes). He was also aware of the multiplication rule for independent events, but was not certain about what values should be multiplied.
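Cardano's counting approach can be sketched in a few lines of modern code. The `prob` helper and variable names below are illustrative inventions, but the arithmetic is exactly his rule: count favourable outcomes over equally likely ones, and note that the multiplication rule he anticipated falls out for independent throws.

```python
from fractions import Fraction
from itertools import product

def prob(event, outcomes):
    # Cardano's rule: probability = favourable outcomes / total equally likely outcomes
    favourable = sum(1 for o in outcomes if event(o))
    return Fraction(favourable, len(outcomes))

one_die = list(range(1, 7))
two_dice = list(product(one_die, repeat=2))  # all 36 ordered pairs

p_six = prob(lambda o: o == 6, one_die)              # 1/6
p_double_six = prob(lambda o: o == (6, 6), two_dice) # 1/36

# Multiplication rule for independent events: P(six, then six) = (1/6) * (1/6)
assert p_double_six == p_six * p_six

# Odds in Cardano's favourable-to-unfavourable form: 1 to 5 for rolling a six
odds = (p_six.numerator, p_six.denominator - p_six.numerator)
```

Using exact fractions rather than floats keeps the favourable-to-total ratios in the same form Cardano reasoned about.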


Circle Lake / Floating Island Mystery

I saw a show about this on TV last night and I thought it was pretty cool.

Located near the northeastern edge of Argentina, in the swampy marshes of the Paraná Delta, is an enigmatic floating island that allegedly rotates on its own axis. Nicknamed “The Eye”, the nearly perfectly circular island has become the subject of an upcoming documentary that will try to unravel the mystery of its existence.

The Eye was discovered six months ago by Argentine film director & producer Sergio Neuspillerm, who was looking for filming locations for a film about paranormal occurrences, like ghost and alien sightings, in the area. After spotting the unusually round island surrounded by an equally round body of water on Google Earth, Neuspillerm and his crew knew they had stumbled upon something truly special, so they abandoned their original film project and decided to focus on this mystery instead.


“When locating this reference in the map we discovered something unexpected that left the film project in the background, we call it ‘The Eye’,” Neuspillerm said in a video. “The Eye is a circle of land surrounded by a thin channel of water with a diameter of 130 yards. Both circles [the water and land] are so perfect that it is hard to believe that this is a natural formation.”

Neuspillerm soon teamed up with Richard Petroni, a hydraulic and civil engineer from New York, and tech expert Pablo Martinez, and together they journeyed to the Paraná Delta to see The Eye firsthand. “The place was amazing and extremely strange. We discovered that the water is incredibly clear and cold, something totally unusual in the area,” the filmmaker said. “The bottom is hard in contrast to the swampy marshes surrounding it and the center part floats. We don’t know over what, but it floats.”

Their expedition brought up more questions than answers, so the trio of explorers recently set up a Kickstarter campaign to crowdfund a second expedition to The Eye and hopefully learn more about its origins and purpose. They’re asking for $50,000, of which they have raised $8,800, with 28 days to go. For a pledge of $25, backers will be invited to watch the upcoming documentary on The Eye online and see all the white papers from the analysis and tests conducted during the investigation. A $10,000 contribution gets you a place alongside the team on the expedition itself, while $5,000 guarantees you a spot on a subsequent visit to the enigmatic island.


“We want to return with a complete scientific expedition having scuba gear, geologists, biologists, ufologists, specialized drones and more, and take samples of the water, soil, plants and all other objects we may find,” Neuspillerm said about their project, called ‘El Ojo’ (The Eye). “We want to understand The Eye’s relation with supernatural stories told by the locals.”

The Eye has apparently been visible on Google Maps – at coordinates 34°15′07.8″S 58°49′47.4″W – for the last decade, but until the El Ojo project went public, no one ever paid it any mind. Now the internet is buzzing with theories about its existence. Most associate it with UFO activity and go as far as to claim that the rotating island conceals an alien base, while the locals believe that its circle-within-a-circle shape represents the presence of God on Earth.

Pablo Suarez, who does dynamic systems modeling at Boston University, allegedly told Paranormal News that he has never seen anything like The Eye before. He added that its almost perfectly circular shape makes it unlikely to be a simple crater or a formation created by a typical natural phenomenon.

However, the same website recently received a message from someone named Daniel Roy Finkley, who claims that The Eye is no mystery: it is apparently one of dozens of such formations, with more or less irregular edges, that are characteristic of the area's natural coastal environment. There’s also a YouTube video that shows several similar formations around Argentina.



Hubble spots farthest star ever seen

The blistering blue star, which existed almost 10 billion years ago, was imaged thanks to a chance alignment that magnified it by a factor of at least 2,000.



In a study published today in Nature Astronomy, an international team of researchers announced the discovery of the most distant star ever observed. The team detected the blue supergiant star — which shone when the universe was just one-third its current age — with the help of both the Hubble Space Telescope and an observational phenomenon known as gravitational lensing.

“This is the first time we’re seeing a magnified, individual star,” said Patrick Kelly, an astrophysicist at the University of Minnesota and lead author of the new study, in a press release. “You can see individual galaxies out there, but this star is at least 100 times farther away than the next individual star we can study, except for supernova explosions.” The unique discovery not only provides astronomers with insight into the formation and evolution of stars in the early universe, but also into the composition of galaxy clusters and even the very nature of dark matter itself.

The light from the record-breaking star, which the team has since nicknamed Icarus, was emitted just 4.4 billion years after the Big Bang. Although the star was undoubtedly bright, its great distance would typically have made it impossible to view, even with our most powerful telescopes. Fortunately, “the star became bright enough to be visible for Hubble thanks to a process called gravitational lensing,” said co-author Jose Diego, an astronomer from the Instituto de Física de Cantabria, in a press release.

Gravitational lensing is an effect predicted by Einstein’s general theory of relativity. It occurs when diverging light rays from a distant object are bent back inward, or lensed, as they pass by an extremely massive object, such as a galaxy cluster. According to the study, when a galaxy cluster serendipitously wanders directly between Earth and a distant background object, gravitational lensing can magnify the distant object by up to a factor of about 50. Furthermore, if there is a smaller, impeccably aligned object within the lensing galaxy cluster, then the background object can be magnified (in a process called gravitational microlensing) by a factor of up to 5,000.


The team initially discovered Icarus while using Hubble to detect and track a known supernova named Refsdal, whose light was predicted to soon be gravitationally lensed by the galaxy cluster MACS J1149, located some 5 billion light-years away. But during their observations, the team was surprised to find another point source was unexpectedly growing brighter within the same field as the expected supernova. While waiting for Refsdal to undergo its predicted lensing event, the researchers accidentally stumbled upon a new star: Icarus.

After spotting Icarus, the researchers used Hubble again to measure the star’s spectrum. By breaking down the star’s light into its constituent colors, the team determined that while Icarus was getting brighter, it was not getting hotter. This meant the star was not another supernova like Refsdal, but instead was a distant, non-exploding star that was being not only lensed by the intervening galaxy cluster, but also microlensed by another small, yet massive object within the cluster.

“We know that the microlensing was caused by either a star, a neutron star, or a stellar-mass black hole,” said co-author Steven Rodney from the University of South Carolina, in a press release. Therefore, the discovery of Icarus allows astronomers to gather new insights into the makeup of the galaxy cluster itself, he explained. Considering galaxy clusters are some of the most massive and sprawling structures in our universe, learning more about their makeup will inevitably help increase our overall understanding of the universe.

Furthermore, the newly discovered star may also help shed light on one of the most mysterious materials in our universe — dark matter. “If dark matter is at least partially made up of comparatively low-mass black holes, as it was recently proposed, we should be able to see this in the light curve of [Icarus],” said Kelly. “Our observations do not favor the possibility that a high fraction of dark matter is made of these primordial black holes with about 30 times the mass of the Sun.”

No matter what astronomers are able to glean from the distant Icarus, this chance discovery of an extremely distant and magnified star is not likely to be the last. With the upcoming launch of modern, more-powerful telescopes like the James Webb Space Telescope, astronomers are optimistic that microlensing events like this may allow them to study the evolution of the universe's earliest stars in unprecedented detail.



Edwin Powell Hubble was an American astronomer. He played a crucial role in establishing the fields of extragalactic astronomy and observational cosmology and is regarded as one of the most important astronomers of all time.






Certain things get better with age... here is a list of five of the most expensive bottles of wine ever sold.


1. CHÂTEAU LAFITE, 1787 — $156,450

Okay, so, yes, 1787 is ancient, especially considering this bottle of Bordeaux was sold at this price in 1985. But don't forget, even the best Bordeaux only lasts about 50 years. So 200 years? Forget about it! Why the hefty price tag? Well, this particular bottle had the initials Th.J. etched into it. That's right, Jefferson was a hard-core oenophile. During the time that he served as ambassador to France, he often traipsed out to Bordeaux and Burgundy looking for wine for his cellar back stateside. Two other bottles etched with his initials have also fetched pricey sums: a 1775 sherry that brought $43,500, and — ready for this? — the most expensive bottle of white wine ever sold, a 1787 Château d'Yquem, for $56,588.

Price per glass: $26,075


2. CHÂTEAU MOUTON-ROTHSCHILD, 1945 — $310,700

Okay, so now you're confused, right? First I said the most expensive bottle ever was about $160K, and now at number two I've listed one that cost almost twice that. Three sheets to the wind? Not at all. See, this bottle of red that sold in 2007 was a large-format bottle, not a standard-size one. But take a look down below at the price per glass and you'll see which is truly the more expensive of the two. Had this giant bottle been a standard 750 ml bottle, it would have sold for only $51,783. (By the way, 1945 is considered one of the very best vintages of the 20th century, and Mouton-Rothschild one of the world's greatest clarets. If you ever happen upon a bottle, don't drink it!)

Price per glass: $8,631


3. INGLENOOK CABERNET SAUVIGNON, 1941 — $24,675

Sold in 2004, this Cabernet is regarded as the most expensive bottle of American wine ever sold. Inglenook is now known as Rubicon and is owned by Francis Ford Coppola, who is said to keep an empty bottle on top of his refrigerator. "It was one of the best I'd ever had," he has said of the wine. So how did it taste? "There is a signature violet and rose petal aroma that completes this amazingly well-preserved, robust wine that had just finished fermentation at the time of Pearl Harbor." Talk about seeing the glass half-full.

Price per glass: $4,113

4. CHÂTEAU MARGAUX, 1787 — $225,000

There I go again. And this is a standard 750 ml bottle. So what's it doing buried way down here? Well, this bottle actually resides in the Most Expensive Bottle of Wine Never Sold category. That's right, I said never sold.

In 1989, the bottle collided with a tray at a wine dinner, and New York wine merchant William Sokolin collected $225,000 from insurance! (He had been seeking a whopping half a million for the bottle, which, he claimed, had also been owned by Thomas Jefferson.)

Price per glass: $37,500

5. KRUG, 1928 — $21,200

The champagne record has been broken often in the last decade. In 2005, a bottle of Krug 1953 went for $12,925. Then, that same year, a Methuselah (6-liter bottle) of Louis Roederer Cristal Brut 1990 "Millennium 2000" sold for $17,625. Finally, the Krug 1928 75 cl bottle was sold at Acker Merrall & Condit's first Hong Kong auction in 2009. Must be some sort of bubbly!
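The per-glass figures in the list above follow from simple division. A quick sketch, assuming a standard 750 ml bottle pours six 125 ml glasses (the glass size is my assumption; the article never states it, though it matches the published figures):

```python
# Price per glass, assuming six 125 ml glasses per standard 750 ml bottle.
GLASSES_PER_750ML = 6

def price_per_glass(bottle_price: float, bottle_ml: int = 750) -> float:
    """Divide a bottle's price by the number of glasses it pours,
    scaling glass count for large-format bottles."""
    glasses = GLASSES_PER_750ML * bottle_ml / 750
    return bottle_price / glasses

# 1787 Château Lafite, standard bottle:
print(round(price_per_glass(156_450)))   # 26075, matching the article's $26,075
# 1787 Château Margaux, standard bottle:
print(round(price_per_glass(225_000)))   # 37500, matching the article's $37,500
```

The same function explains the large-format entry: a 4.5-liter bottle holds six standard bottles' worth, so its total price can dwarf the Lafite's while its per-glass price stays far lower.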


The 1938 New England Hurricane



The 1938 New England Hurricane (also referred to as the Great New England Hurricane and the Long Island Express) was one of the deadliest and most destructive tropical cyclones ever to strike Long Island, New York, and New England. The storm formed near the coast of Africa on September 9, becoming a Category 5 hurricane on the Saffir-Simpson Hurricane Scale before making landfall as a Category 3 hurricane on Long Island on September 21. It is estimated that the hurricane killed 682 people, damaged or destroyed more than 57,000 homes, and caused property losses estimated at US$306 million ($4.7 billion in 2017). Damaged trees and buildings were still seen in the affected areas as late as 1951. It remains the most powerful and deadliest hurricane in recorded New England history, eclipsed in landfall intensity perhaps only by the Great Colonial Hurricane of 1635.

The storm was first analyzed by ship data south of the Cape Verde Islands on September 9. Over the next ten days, it steadily gathered strength and slowly tracked to the west-northwest; it is estimated to have reached Category 5 intensity by September 20, while centered east of the Bahamas. It then veered northward in response to a deep trough over Appalachia, sparing the Bahamas, Florida, the Carolinas, and the Mid-Atlantic states. At the same time, a high pressure system was centered north of Bermuda, preventing the hurricane from making an eastward turn out to sea.

Thus, the hurricane was effectively squeezed to the north between the two weather systems. This caused the storm's forward speed to increase substantially late on September 20, ultimately reaching 70 mph, the highest forward velocity ever recorded for a hurricane. This extreme forward motion was in the same general direction as the winds on the eastern side of the storm as it proceeded north; this, in turn, made the wind speed far higher in areas east of the storm's eye than would be the case with a hurricane of more typical forward speed.

The storm was centered several hundred miles to the southeast of Cape Hatteras during the early hours of September 21, and it weakened slightly. By 8:30 am EDT, it was centered approximately 100 miles (160 km) due east of Cape Hatteras, and its forward speed had increased to well over 50 mph. This rapid movement did not permit enough time for the storm to weaken over the cooler waters before it reached Long Island. The hurricane sped through the Virginia tidewater during the 9:00 am hour. Between 12:00 pm and 2:00 pm, the New Jersey coastline and New York City caught the western edge. At the same time, weather conditions began to deteriorate rapidly on Long Island and along the southern New England coast.

The hurricane made landfall at Bellport in Long Island's Suffolk County sometime between 2:10 pm and 2:40 pm as a Category 3 hurricane, with sustained winds of 120 mph. It made a second landfall as a Category 3 hurricane somewhere between Bridgeport and New Haven, Connecticut, at around 4:00 pm, with sustained winds of 115 mph.

The storm's eye moved into western Massachusetts by 5:00 pm, and it reached Vermont by 6:00 pm. Both Westfield, Massachusetts and Dorset, Vermont reported calm conditions and partial clearing during passage of the eye, which is a rather unusual occurrence for a New England hurricane. The hurricane began to lose tropical characteristics as it continued into northern Vermont, though it was still carrying hurricane-force winds. It crossed into Quebec at approximately 10:00 pm, transitioning into a post-tropical low. The post-tropical remnants dissipated over northern Ontario a few days later.



Forecasting the storm

In 1938, United States forecasting lagged behind forecasting in Europe, where new techniques were being used to analyze air masses, taking into account the influence of fronts. A confidential report released by the United States Department of Agriculture, then the parent agency of the United States Weather Bureau, described the bureau's forecasting as "a sorry state of affairs" in which forecasters had poor training, systematic planning was not used, and forecasters had to "scrape by" to get information wherever they could. The Jacksonville, Florida, office of the weather bureau issued a warning on September 19 that a hurricane might hit Florida. Residents and authorities made extensive preparations, as they had endured the Labor Day Hurricane three years earlier. When the storm turned north, the office issued warnings for the Carolina coast and transferred authority to the bureau's headquarters in Washington.

At 9:00 am on September 21, the Washington office issued northeast storm warnings north of Atlantic City and south of Block Island, Rhode Island, and southeast storm warnings from Block Island to Eastport, Maine. The advisory, however, underestimated the storm's intensity and said that it was farther south than it actually was. The office had yet to forward any information about the hurricane to the New York City office. At 10:00 am, the bureau downgraded the hurricane to a tropical storm. The 11:30 am advisory mentioned gale-force winds but nothing about a tropical storm or hurricane.

That day, 28-year-old rookie Charles Pierce was standing in for two veteran meteorologists. He concluded that the storm would be squeezed between a high-pressure area located to the west and a high-pressure area to the east, and that it would be forced to ride up a trough of low pressure into New England. A noon meeting was called and Pierce presented his conclusion, but he was overruled by "celebrated" chief forecaster Charles Mitchell and his senior staff. In Boston, meteorologist E.B. Rideout told his WEEI radio listeners (to the skepticism of his peers) that the hurricane would hit New England. At 2:00 pm, hurricane-force gusts were occurring on Long Island's South Shore, with near hurricane-force gusts on the coast of Connecticut. The Washington office issued an advisory saying that the storm was 75 miles east-southeast of Atlantic City and would pass over Long Island and Connecticut. Re-analysis of the storm suggests that the hurricane was farther north (just 50 miles from Fire Island), and that it was stronger and larger than the advisory said.





The majority of the storm damage was from storm surge and wind. Damage was estimated at $308 million, the equivalent of $5.1 billion adjusted for inflation in 2016 dollars, making it among the most costly hurricanes to strike the U.S. mainland. It is estimated that, if an identical hurricane had struck in 2005, it would have caused $39.2 billion in damage due to changes in population and infrastructure.

Approximately 600 people died in the storm in New England, most in Rhode Island, and up to 100 people elsewhere in the path of the storm. An additional 708 people were reported injured.

In total, 4,500 cottages, farms, and other homes were reported destroyed and 25,000 homes were damaged. Other damages included 26,000 automobiles destroyed and 20,000 electrical poles toppled. The hurricane also devastated the forests of the Northeast, knocking down an estimated two billion trees in New York and New England. Freshwater flooding was minimal, however, as the quick passage of the storm decreased local rainfall totals, with only a few small areas receiving over 10 inches (250 mm).

Over 35% of New England's total forest area was affected. In all, over 2.7 billion board feet of trees fell because of the storm, although 1.6 billion board feet were salvaged. The Northeastern Timber Salvage Administration (NETSA) was established to deal with the extreme fire hazard that the fallen timber had created. In many locations, roads cut to remove the fallen trees were visible decades later, and some became trails that are still used today. The New York, New Haven and Hartford Railroad from New Haven to Providence was particularly hard hit, as countless bridges along the Shore Line were destroyed or flooded, severing rail connections to badly affected cities such as Westerly, Rhode Island.

Due to the limits of forecasting technology in 1938, Long Island residents received no warning of the hurricane's arrival, leaving no time to prepare or evacuate. Long Island was struck first, before New England and Quebec, earning the storm the nickname the "Long Island Express." The winds reached up to 150 mph, with waves surging to around 25–35 feet high.

Yale and Harvard both owned large forests managed by their forestry departments, and both forests were wiped out by the hurricane. However, Yale had a backup forest at Great Mountain in northwestern Connecticut that was spared the worst of the damage, so it was able to keep its forestry program running; the program still operates today. Harvard's program, however, was scaled back as a result.

The storm surge hit Westerly, Rhode Island at 3:50 pm, resulting in 100 deaths. The tide was higher than usual because of the autumnal equinox and full moon, and the hurricane produced storm tides of 14 to 18 feet (4 to 5 m) along most of the Connecticut coast, with 18- to 25-foot (5 to 8 m) tides from New London east to Cape Cod, including the entire coastline of Rhode Island.

The storm surge was especially violent along the Rhode Island shore, sweeping hundreds of summer cottages out to sea. Low-lying Block Island was almost completely underwater, and many drowned. As the surge drove northward through Narragansett Bay, it was restricted by the Bay's funnel shape and rose to 15.8 feet above normal spring tides, resulting in more than 13 feet (4.0 m) of water in some areas of downtown Providence. Several motorists were drowned in their automobiles. In Jamestown, seven children were killed when their school bus was blown into Mackerel Cove. Many stores in downtown Providence were looted by mobs, often before the flood waters had fully subsided and due in part to the economic difficulties of the Great Depression.

Many homes and structures were destroyed along the coast, as well as many structures inland along the hurricane's path. Entire beach communities on the coast of Rhode Island were obliterated. Napatree Point, a small cape just off Watch Hill that housed nearly 40 families between the Atlantic Ocean and Little Narragansett Bay, was completely swept away. Today, Napatree is a wildlife refuge with no human inhabitants. One house in Charlestown, Rhode Island was lifted and deposited across the street, where it stood until it was demolished in August 2011. Even today, concrete staircases and boardwalk bases destroyed by the hurricane can be found when sand levels are low on some beaches. The boardwalk along Easton's Beach in Newport was completely destroyed by the storm.

A few miles from Conanicut Island, Whale Rock Light was swept off its base and into the raging waves, killing lighthouse keeper Walter Eberle. His body was never found. The Prudence Island Light suffered a direct blow from the storm surge, which measured 17 feet 5 inches at Sandy Point. The masonry tower was slightly damaged. However, the adjoining light keeper's home was utterly destroyed and washed out to sea. The light keeper's wife and son were both killed, as well as the former light keeper and a couple who left their summer cottages near the lighthouse and sought shelter in what they thought was the sturdier light keeper's home. Light keeper George T. Gustavus was thrown free from the wreckage of the house and was saved by an island resident who held a branch into the water from the cliffs farther down the coast. Gustavus and Milton Chase, the owner of the island's power plant, reactivated the light during the storm by running a cable from the plant to the light and installing a light bulb, marking the first time that it was illuminated with electricity.

The original parchment of the 1764 Charter of Brown University was washed clean of its text when its vault was flooded in a Providence bank. Newport recorded the highest water level of the storm, at 3.53 meters above mean sea level, according to a NOAA study. This storm level is 0.98 meters above the SLOSH model of a 100-year storm, and one estimate is that this water level "reflects a storm occurring roughly once every 400 years." A study of sand deposits also gives evidence that this was the strongest hurricane to hit Rhode Island in over 300 years. The Fox Point Hurricane Barrier was completed in 1966 because of the massive flooding from the 1938 storm, and from the even higher 14.4-foot (4.4 m) storm surge of 1954's Hurricane Carol, in hopes of preventing extreme storm surges from ever again flooding downtown Providence.




Eastern Connecticut was on the eastern side of the hurricane. Long Island acted as a buffer against large ocean surges, but the waters of Long Island Sound rose to great heights. Small shoreline towns to the east of New Haven experienced much destruction from the water and winds, and the 1938 hurricane holds the record for the worst natural disaster in Connecticut's 350-year history. The mean low-water storm tide was 14.1 feet at Stamford, 12.8 feet at Bridgeport, and 10.58 feet at New London, which remains a record high.

In the shoreline towns of Madison, Clinton, Westbrook, and Old Saybrook, buildings were found as wreckage across coastal roads. Actress Katharine Hepburn waded to safety from her Old Saybrook beach home, narrowly escaping death. She stated in her 1991 book that 95% of her personal belongings were either lost or destroyed, including her 1932 Oscar, which was later found intact. In Old Lyme, beach cottages were flattened or swept away. The NY NH&H passenger train Bostonian became stuck in debris at Stonington. Two passengers drowned while attempting to escape before the crew was able to clear the debris and get the train moving. Along the Stonington shorefront, buildings were swept off their foundations and found two miles (3 km) inland. Rescuers found live fish and crabs in kitchen drawers and cabinets while searching for survivors in the homes in Mystic.

New London was first swept by the winds and storm surge, after which the waterfront business district caught fire and burned out of control for 10 hours. Stately homes along Ocean Beach were leveled by the storm surge. The permanently anchored 240-ton lightship at the head of New London Harbor was found on a sand bar two miles (3 km) away.

Interior sections of the state experienced widespread flooding as the hurricane's torrential rains fell on soil already saturated from previous storms. The Connecticut River was forced out of its banks, inundating cities and towns from Hartford to Middletown. Novelist Ann Petry drew on her personal experiences of the hurricane in Old Saybrook in her 1947 novel Country Place. The novel is set in the immediate aftermath of World War II, but Petry identified the 1938 storm as the source for the storm that is at the center of her narrative.