Archive for the ‘Science’ Category

A multiverse model of the Universe from 800 years ago

May 6, 2014

Bishop Robert Grosseteste, detail of a window on the South transept Westernmost. St Paul’s Parish Church, Morton, Near Gainsborough. 1896

Robert Grosseteste (ca. 1168–1253), Bishop of Lincoln from 1235 to 1253, was one of the most prominent and remarkable figures in thirteenth-century English intellectual life. His views on light and matter were some 800 years in advance of his time.

If a single leitmotif runs through Grosseteste’s works, it is that of light. The notion of light occupies a prominent place in Grosseteste’s commentaries on the Bible, in his account of sense perception and the relation of body and soul, in his illuminationist theory of knowledge, in his account of the origin and nature of the physical world, and, of course, in his writings on optics. 

 

His treatise De Luce (meaning “Concerning Light”), written in about 1225, describes a Universe created in a Big Bang-like explosion of light, which then condenses into a series of nine celestial spheres.

Past Horizons:

The Ordered Universe Project, which brings together physicists, psychologists, cosmologists, Latin experts and medieval historians, has been studying the texts of Robert Grosseteste, one-time Bishop of Lincoln. The team created a fresh Latin translation, aided by other experts with knowledge of the medieval mindset and its context, before applying modern mathematical and computational techniques to Grosseteste’s equations. ….

Dr Giles Gasper, the Ordered Universe Project’s Principal Investigator and Associate Director of Durham University’s Institute of Medieval and Early Modern Studies, said: “De Luce is the earliest known attempt to describe the Universe using a coherent set of physical laws, centuries before Sir Isaac Newton.

“It proposes that the same physics of light and matter, which explain the solidity of ordinary objects, could be applied to the cosmos as a whole. In doing so it also suggests, although this was probably not apparent to Grosseteste at the time, a series of ordered universes reminiscent of the modern “multiverse” concept.

“Grosseteste’s calculations are very consistent and precise. Had he had access to modern calculus and computing methods, he surely would have used them, so that is what the team has done.”

Richard G. Bower, Tom C. B. McLeish, Brian K. Tanner, Hannah E. Smithson, Cecilia Panti, Neil Lewis, Giles E. M. Gasper, A Medieval Multiverse: Mathematical Modelling of the 13th Century Universe of Robert Grosseteste, Proceedings of the Royal Society A, arXiv:1403.0769

Abstract: In his treatise on light, written in about 1225, Robert Grosseteste describes a cosmological model in which the Universe is created in a big-bang like explosion and subsequent condensation. He postulates that the fundamental coupling of light and matter gives rise to the material body of the entire cosmos. Expansion is arrested when matter reaches a minimum density and subsequent emission of light from the outer region leads to compression and rarefaction of the inner bodily mass so as to create nine celestial spheres, with an imperfect residual core. In this paper we reformulate the Latin description in terms of a modern mathematical model. The equations which describe the coupling of light and matter are solved numerically, subject to initial conditions and critical criteria consistent with the text. Formation of a universe with a non-infinite number of perfected spheres is extremely sensitive to the initial conditions, the intensity of the light and the transparency of these spheres. In this “medieval multiverse”, only a small range of opacity and initial density profiles lead to a stable universe with nine perfected spheres. As in current cosmological thinking, the existence of Grosseteste’s universe relies on a very special combination of fundamental parameters.
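The paper's method, writing the coupling of light and matter as a pair of differential equations and integrating them numerically from chosen initial conditions, can be sketched generically. The toy equations below are purely illustrative placeholders, not Grosseteste's or the paper's actual system:

```python
# A generic toy of the method: couple "light" and "matter" in a pair of
# ODEs and integrate them numerically with forward Euler. These
# equations are invented for illustration only.

def toy_integrate(rho0, light0, coupling=0.5, dt=0.01, steps=1000):
    """Forward-Euler integration of a toy light-matter coupling:
    light rarefies matter, matter attenuates light."""
    rho, light = rho0, light0
    for _ in range(steps):
        drho = -coupling * light * rho * dt   # light rarefies matter
        dlight = -rho * light * dt            # matter attenuates light
        rho += drho
        light += dlight
    return rho, light

rho, light = toy_integrate(1.0, 1.0)
```

Even in a toy like this, the end state depends strongly on the initial values and the coupling constant, which is the kind of sensitivity the authors report for the number of perfected spheres.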

God is an hypothesis and a mathematician is a linguist

May 6, 2014

Science discovers, engineering invents.

Eyes are to vision as language is to discovery.

To be discovered it must first be imaginable.

To describe and communicate what can be imagined needs language.

To be “discovered” requires that something imagined in a language be “sensed” (observed or measured or calculated or inferred).

Something imagined to exist but not yet discovered is a faith – an hypothesis.

Without the attribute of hearing, there is no sound.

Discoveries need a suitable language to first describe them before they can be found (Mathematics, Chemistry, Algebra, Logic….).

Language is an invention and cannot be discovered.

The application of discovered science to the manufacturing of artefacts is engineering.

Mathematics is a language and a mathematician is a linguist (an engineer).

Logic is a language and a logician is a philosopher.

Philosophers imagine and describe but neither discover nor invent.

Music is a science and a musician is a scientist.

Painting (or sculpture) is engineering and the artist is an engineer.

Medicine is a science but a practising physician is an engineer.

The symbol for a thing is not the thing.

God is an hypothesis and a mathematician is a linguist.

 

Idiot science: Babies cry at night to prevent Mom from having another child!!

April 28, 2014
David Haig

Some so-called “science” is done primarily for headlines – even at Harvard. I wonder if there is a correlation between headlines generated and funding received?

This time the idiot science is from David Haig – a biology professor at Harvard. His abstract states:

All these observations are consistent with a hypothesis that waking at night to suckle is an adaptation of infants to extend their mothers’ lactational amenorrhea, thus delaying the birth of a younger sib and enhancing infant survival.

From Science News:

When a baby cries at night, exhausted parents scramble to figure out why. He’s hungry. Wet. Cold. Lonely. But now, a Harvard scientist offers a more sinister explanation: The baby who demands to be breastfed in the middle of the night is preventing his mom from getting pregnant again.

This devious intention makes perfect sense, says evolutionary biologist David Haig, who describes his idea in Evolution, Medicine and Public Health. Another baby means having to share mom and dad, so babies are programmed to do all they can to thwart the meeting of sperm and egg, the theory goes.

Since babies can’t force birth control pills on their mothers, they work with what they’ve got: Nighttime nursing liaisons keep women from other sorts of liaisons that might lead to another child. And beyond libido-killing interruptions and extreme fatigue, frequent night nursing also delays fertility in nursing women. Infant suckling can lead to hormone changes that put the kibosh on ovulation (though not reliably enough to be a fail-safe birth control method, as many gynecologists caution).

Of course, babies don’t have the wherewithal to be interrupting their mothers’ fertility intentionally. It’s just that in our past, babies who cried to be nursed at night had a survival edge, Haig proposes.

The timing of night crying seems particularly damning, Haig says. Breastfed babies seem to ramp up their nighttime demands around 6 months of age and then slowly improve — precisely the time when a baby would want to double down on its birth control efforts. … 

Tenured Professors would seem to have little need for common sense.

What is worse than the idiot science is the fawning article by Laura Sanders in Science News.

More stem cell fakery as a quick way to publication and fame?

March 11, 2014

Dr Haruko Obokata shot to fame with her stem cell papers. Photo: BBC

Another young researcher, Dr Haruko Obokata, has apparently made sensational claims about her stem cell research, shot to fame as lead author of two papers published in Nature, and is now in the dock for dodgy images and irreproducible (perhaps faked) results.

WSJ: Her co-author, Teruhiko Wakayama of Yamanashi University in Japan, called Monday for the retraction of the findings, published in late January in a pair of papers in the journal Nature.

The papers drew international attention because they held out a safer, easier and more ethical technique for creating master stem cells. These cells, which can be turned into all other body tissues, promise one day to transform the treatment of various ailments, from heart disease to Alzheimer’s. 

But shortly after the papers appeared, Japan’s Riken Center for Developmental Biology, where the work took place, began to investigate alleged irregularities in images used in the papers. Separately, many labs said they couldn’t replicate the results.

A spokesman for Riken said Tuesday that the institution was considering a retraction and that the article’s authors were discussing what to do.

Dr. Wakayama said he has asked the lead author, Haruko Obokata, to retract the studies. “There is no more credibility when there are such crucial mistakes,” he said in an email to The Wall Street Journal.

Dr. Wakayama said he learned Sunday that an image used in Dr. Obokata’s 2011 doctoral thesis had also been used in the Nature papers. “It’s unlikely that it was a careless mistake since it’s from a different experiment from a different time,” he said.

Like several other researchers, Dr. Wakayama said he hasn’t yet been able to reproduce the results. “There is no value in it if the technique cannot be replicated,” he said. 

But another co-author of the papers, Charles Vacanti, a tissue engineer at Harvard Medical School and Brigham and Women’s Hospital in Boston, defended the work. “Some mistakes were made, but they don’t affect the conclusions,” he said in an interview Monday. “Based on the information I have, I see no reason why these papers should be retracted.”

Dr. Vacanti—whose early work some 15 years ago spurred the novel experiments—said he was surprised to hear that one of his co-authors asked for the retraction.

Dr. Vacanti said he had spoken to Dr. Obokata on Monday and that she also stood by the research. “It would be very sad to have such an important paper retracted as a result of peer pressure, when indeed the data and conclusions are honest and valid,” said Dr. Vacanti. …..

The papers created a stir because they reported a process by which mouse cells could be returned to an embryonic-like state simply by dipping them in a mild acid solution, creating what they called STAP cells, for stimulus-triggered acquisition of pluripotency. ….

There seems to be a hint of some “academic rivalry” here as well.

Retraction Watch has more:

Nature told the WSJ that it was still investigating the matter. As Nature’s news section reported last month, lead author

…biologist Haruko Obokata, who is based at the institution…shot to fame as the lead author of two papers published in Nature last month that demonstrated a way to reprogram mature mouse cells into an embryonic state by simply applying stress, such as exposure to acid conditions or physical pressure on cell membranes.

But the studies, published online on January 29, soon came under fire. Paul Knoepfler has had a number of detailed posts on the matter, as has PubPeer.

Stem cell research seems to have more than its fair share of dodgy papers – presumably because sensational results are easier to come by and very much easier to get published.

The sun dances to its own tune with unexpectedly high activity in February

March 4, 2014

There is more we don’t know that we don’t know that we don’t know. (with apologies to Donald Rumsfeld).

While Solar Cycle 24 is still showing the lowest sunspot activity in 100 years, its activity during February was unexpectedly high.  And we don’t really know why.

SC24 Feb 2014 graphic, NOAA data, from informthepundits

The increased activity in February was certainly not expected but SC24 still remains at a very low activity level compared to recent Solar Cycles.

SC21 – SC24 from solen.info

It is a little too early to see if the similarities with Solar Cycles 5 and 6 (during the time of the Dalton Minimum) will hold up and lead us into another Little Ice Age. Or whether this Landscheidt Minimum will just lead to a 30 year cooling period without quite producing a Little Ice Age.

Interesting times.

The Sun will continue to dance to its own tune and we will make believe that we know what it is doing.

But we ignore the Sun at our peril.

“Peer review is a regression to the mean. ….. a completely corrupt system” – Sydney Brenner

March 2, 2014
Sydney Brenner

Sydney Brenner, CH FRS (born 13 January 1927) is a biologist and a 2002 Nobel prize laureate, shared with H. Robert Horvitz and John Sulston. Brenner made significant contributions to work on the genetic code, and other areas of molecular biology at the Medical Research Council Unit in Cambridge, England.

A fascinating interview with Professor Sydney Brenner by Elizabeth Dzeng in the Kings Review.

I find his comments on academia, publishing and peer review particularly apposite. Peer review – especially where cliques of “peers” determine “correct thinking” – cannot provide sufficient room for the dissenting view, for the challenging of orthodoxy. Orthodox but incorrect views thus persist for much longer than they should. Completely new avenues are effectively blocked and ideas are still-born.

Some extracts here but the whole conversation is well worth a read.

How Academia and Publishing are Destroying Scientific Innovation: A Conversation with Sydney Brenner

by Elizabeth Dzeng, February 24th

I recently had the privilege of speaking with Professor Sydney Brenner, a professor of Genetic medicine at the University of Cambridge and Nobel Laureate in Physiology or Medicine in 2002. ….

SB: Today the Americans have developed a new culture in science based on the slavery of graduate students. Now graduate students of American institutions are afraid. He just performs. He’s got to perform. The post-doc is an indentured labourer. We now have labs that don’t work in the same way as the early labs where people were independent, where they could have their own ideas and could pursue them.

The most important thing today is for young people to take responsibility, to actually know how to formulate an idea and how to work on it. Not to buy into the so-called apprenticeship. I think you can only foster that by having sort of deviant studies. ……..

…… I think I’ve often divided people into two classes: Catholics and Methodists. Catholics are people who sit on committees and devise huge schemes in order to try to change things, but nothing’s happened. Nothing happens because the committee is a regression to the mean, and the mean is mediocre. Now what you’ve got to do is good works in your own parish. That’s a Methodist. 

ED: …….. It is alarming that so many Nobel Prize recipients have lamented that they would never have survived this current academic environment. What is the implication of this on the discovery of future scientific paradigm shifts and scientific inquiry in general? I asked Professor Brenner to elaborate.

SB: He wouldn’t have survived. It is just the fact that he wouldn’t get a grant today because somebody on the committee would say, oh those were very interesting experiments, but they’ve never been repeated. And then someone else would say, yes and he did it a long time ago, what’s he done recently?  And a third would say, to top it all, he published it all in an un-refereed journal.

So you know we now have these performance criteria, which I think are just ridiculous in many ways. But of course this money has to be apportioned, and our administrators love having numbers like impact factors or scores. ….

……. And of course all the academics say we’ve got to have peer review. But I don’t believe in peer review because I think it’s very distorted and as I’ve said, it’s simply a regression to the mean.

I think peer review is hindering science. In fact, I think it has become a completely corrupt system. It’s corrupt in many ways, in that scientists and academics have handed over to the editors of these journals the ability to make judgment on science and scientists. There are universities in America, and I’ve heard from many committees, that we won’t consider people’s publications in low impact factor journals.

Now I mean, people are trying to do something, but I think it’s not publish or perish, it’s publish in the okay places [or perish]. And this has assembled a most ridiculous group of people.

…….. I think there was a time, and I’m trying to trace the history when the rights to publish, the copyright, was owned jointly by the authors and the journal. Somehow that’s why the journals insist they will not publish your paper unless you sign that copyright over. It is never stated in the invitation, but that’s what you sell in order to publish. And everybody works for these journals for nothing. There’s no compensation. There’s nothing. They get everything free. They just have to employ a lot of failed scientists, editors who are just like the people at Homeland Security, little power grabbers in their own sphere.

If you send a PDF of your own paper to a friend, then you are committing an infringement. Of course they can’t police it, and many of my colleagues just slap all their papers online. I think you’re only allowed to make a few copies for your own purposes. It seems to me to be absolutely criminal. When I write for these papers, I don’t give them the copyright. I keep it myself. That’s another point of publishing, don’t sign any copyright agreement. That’s my advice. I think it’s now become such a giant operation. I think it is impossible to try to get control over it back again. …….. Recently there has been an open access movement and it’s beginning to change. I think that even Nature, Science and Cell are going to have to begin to bow. I mean in America we’ve got old George Bush who made an executive order that everybody in America is entitled to read anything printed with federal funds, tax payers’ money, so they have to allow access to this. But they don’t allow you access to the published paper. They allow you I think what looks like a proof, which you can then display.

Elizabeth Dzeng is a PhD candidate conducting research at the intersection of medical sociology, clinical medicine and medical ethics at the University of Cambridge. She is also a practising doctor and a fellow in General Internal Medicine at the Johns Hopkins School of Medicine.

Orbiting charges

February 27, 2014

From TEX


Electric field due to 3 charges. The black one is a negative charge orbiting the other two positive charges.
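The field pictured is an instance of Coulomb superposition: the field at any point is the vector sum of K·q/r² contributions from each charge, directed along the line from the charge to the point. A minimal sketch, with illustrative charge values and positions (not read off the original image):

```python
import math

# Superposition of Coulomb fields from point charges. The charge
# values and positions below are illustrative only.

K = 8.9875517923e9  # Coulomb constant in N*m^2/C^2

def e_field(charges, x, y):
    """Total E-field (Ex, Ey) at (x, y) from charges [(q, xq, yq), ...]."""
    ex = ey = 0.0
    for q, xq, yq in charges:
        dx, dy = x - xq, y - yq
        r2 = dx * dx + dy * dy
        r = math.sqrt(r2)
        ex += K * q * dx / (r2 * r)  # (K*q/r^2) * (dx/r)
        ey += K * q * dy / (r2 * r)
    return ex, ey

# Two fixed positive charges and one negative charge between them
charges = [(1e-9, -1.0, 0.0), (1e-9, 1.0, 0.0), (-1e-9, 0.0, 0.5)]
ex, ey = e_field(charges, 0.0, 2.0)  # a point on the symmetry axis
```

On the symmetry axis the x-components of the two positive charges cancel exactly; animating the negative charge along its orbit and re-evaluating the sum at each frame is all the physics the plotted field requires.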

Number of citations and excellence in science

February 10, 2014

Scientific excellence can only truly be judged by history. But history has eyes only for impact, and if excellent science causes no great change to science orthodoxy, it is soon forgotten. For a scientist the judgements of history long after he performs his science are of no real significance. Even where academic freedom is the main motivator for the scientist, the degrees of freedom available are related to academic success. An academic or scientific career depends increasingly on contemporaneous judgements – and here social networking, peer review and bibliometric factors are decisive. There may well be some correlation between academic success and the “goodness” of the scientist, but it is not the success or the bibliometrics which are causative.

As Lars Walloe puts it: Walloe-on-Exellence

In the evaluation process many scientists and nearly all university and research council administrators love all kinds of bibliometric tools. This has of course a simple explanation. The “bureaucracy” likes to have a simple quantitative tool, which can be used with the aid of a computer and the internet to give an “objective” measure of excellence. However, excellence is not directly related either to the impact factor of the journal in which the work is published, or to the number of citations, or to the number of papers published, or even to some other more sophisticated bibliometric indices. Of course there is some correlation, but it is in my judgement weaker than what many would like to believe, and uncritical use of these tools easily leads to wrong conclusions. For instance the impact factor of a journal is mainly determined by the very best papers published in it and not so much by the many ordinary papers published. We know well that even in high impact factor journals like Science and Nature or high impact journals in more specialized fields, from time to time not so excellent papers are being published. 

….. I often meet scientists for whom obtaining high bibliometric factors serves as a prime guidance in their work. Too many of them are really not that good, but were just lucky or work in a field where it was easier to get many citations. ….. If you are working with established methods in a popular field you can be fairly sure to get your papers published. I can mention in detail some medical fields where I know that this has happened or is happening today. The scientists in such fields get a high number of publications and citations, but the research is not necessarily excellent. 
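Walloe's point that an impact factor is a mean dominated by a journal's few best papers is easy to demonstrate with invented numbers: under a heavy-tailed citation distribution, the mean says little about the typical paper.

```python
# Illustration of Walloe's point: the impact factor (a mean) is set
# mainly by a few highly cited papers. All numbers here are invented.

def impact_factor(citations):
    """Mean citations per paper: the core of the impact-factor formula."""
    return sum(citations) / len(citations)

# 95 "ordinary" papers and 5 blockbusters
citations = [2] * 95 + [400] * 5

overall = impact_factor(citations)           # 21.9
without_top = impact_factor(citations[:95])  # 2.0, the typical paper
```

Dropping the five best papers collapses the mean by an order of magnitude, which is why a journal's impact factor is a poor proxy for the quality of any ordinary paper in it.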

And getting your paper published has now become so important in the advancement of an academic career that journals are proliferating. Many of the new journals have now shifted their business models to be based on authors’ fees and not on volume of readership. This is a very “safe” business model since profits are ensured before the journal has even been published, and if the journal is an on-line journal then costs are minimal. It is virtually the “self-publishing” of papers. You pay your money and get your paper published.

The reality today is that more papers are being published by more authors in more journals than ever before. But fewer are actually being read. Papers are cited without having been read – let alone understood.

Skeptical Scalpel:

Another reason could be that publishers, particularly those who charge authors fees for publishing, are in the business of making money.

Authoring journal articles is not only enhancing to one’s CV (the old “publish or perish” cliché), it is required by Residency Review Committees as evidence of “scholarly activity” in training programs. Maybe it’s good for attracting referrals too.

The publish or perish ethos has led to a proliferation of the number of authors per paper!

First noted in 1993 by a paper in Acta Radiologica and a letter in the BMJ, the number of authors per paper has risen dramatically over the years. A study of 12 radiology journals found the number of authors per paper doubled, from 2.2 in 1966 to 4.4 in 1991. A review of Neurosurgery and the Journal of Neurosurgery spanned 50 years: the average went from 1.8 authors per article in 1945 to 4.6 authors in 1995. Of note, the above two articles were each written by a single author.

Three psychiatrists from Dartmouth analyzed original scientific articles in four of the most prestigious journals in the United States—Archives of Internal Medicine, Annals of Internal Medicine, Journal of the American Medical Association, and the New England Journal of Medicine—from 1980 to 2000. They found that the mean number of authors per paper increased from 4.5 to 6.9. The same is true for two plastic surgery journals, which saw the average number of authors go from 1.4 to 4.0 and 1.7 to 4.2 in the 50 years from 1955 to 2005. The number of single-author papers went from 78% to 3% in one journal and from 51% to 8% in another.

In orthopedics, a review of the American and British versions of the Journal of Bone and Joint Surgery for 60 years from 1949 to 2009 showed an increase of authors per paper from 1.6 to 5.1.

An impressive rise in the number of authors took place in two leading thoracic surgery journals. For the Journal of Thoracic and Cardiovascular Surgery the increase was from 1.4 in 1936 to 7.5 in 2006, and for the Annals of Thoracic Surgery it was from 3.1 in 1966 to 6.8 in 2006.
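The figures quoted above can be gathered into one structure so the growth is easy to compare directly. The values are simply those cited in the quote; the short journal labels are mine:

```python
# Authors-per-paper growth figures as quoted in the post. Each entry is
# (start year, start mean, end year, end mean). Labels are shorthand.

author_growth = {
    "12 radiology journals":      (1966, 2.2, 1991, 4.4),
    "Neurosurgery journals":      (1945, 1.8, 1995, 4.6),
    "4 top US medical journals":  (1980, 4.5, 2000, 6.9),
    "J Thorac Cardiovasc Surg":   (1936, 1.4, 2006, 7.5),
    "Ann Thorac Surg":            (1966, 3.1, 2006, 6.8),
    "J Bone Joint Surg (US+UK)":  (1949, 1.6, 2009, 5.1),
}

for name, (y0, a0, y1, a1) in author_growth.items():
    print(f"{name}: {a0} ({y0}) -> {a1} ({y1}), x{a1 / a0:.1f}")
```

Every series shows at least a doubling, with the thoracic surgery journal more than quintupling over 70 years.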

And the winner is a paper with 3171 authors! Needless to say it comes from Big Science and the Large Hadron Collider:

the paper with the most authors is “Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC” in a journal called “Physics Letters B” with 3171. The list of authors takes up 9 full pages.

Too many journals, too many papers, too many authors and too many citations. But that does not mean there is more excellence in science.

Academic backstabbing, misconduct, conspiracy and much, much more at Purdue University

February 7, 2014

The Purdue University School of Nuclear Engineering is the unlikely location for a tangled, sordid tale which I cannot make much sense of.

Professor Rusi Taleyarkhan is either a somewhat naive victim of a nasty conspiracy, or he is guilty of academic misconduct and has received his just deserts. His primary antagonist was Professor Lefteri Tsoukalas, once head of the School of Nuclear Engineering, who was forced by Purdue to resign that post. Purdue removed Taleyarkhan’s endowed professorship, reduced his salary, and limited his duties with students. There is a murky connection between a journalist, Eugenie Reich, and Tsoukalas while Reich was promoting her book about scientific misconduct, and there was some form of cooperation between them and a number of others to accuse Taleyarkhan.

Once I got this far I gave up.

There is quite obviously a great deal of muck in the Purdue University School of Nuclear Engineering. The University is probably vacillating in its support between the warring academics. The role of the journalist is what adds to the possibility of a nasty conspiracy.

It seems too tawdry to waste much time on though, of course, some careers are being destroyed and someone is – or both are – indulging in defamation.

The New Energy Times has a whole series of articles on the subject. They seem to feel that Taleyarkhan has been badly wronged. This article in TwoCircles also takes that position. Tsoukalas puts his position in a letter provided to the New York Times (in 2007).

Oh what a tangled web they weave. It’s all about low-energy, table-top fusion — so it is all probably a hurricane in a thimble. I cannot help observing that cold fusion and claims of misconduct generally seem to go hand-in-hand!

From graphene to borophene

January 29, 2014

Technology development waves

The discovery of graphene is leading to new excitement in materials research. I have a notion that technology advances take place in step waves, where each step is both enabled and constrained by the materials available. Each time a new material (or material family) is discovered, technology development starts very fast and then tapers off, until another material comes along and ignites a new development wave.

Boron is Carbon’s neighbour in the periodic table and the discovery of graphene has ignited studies to see if a similar variation of boron would be possible.

Boron is a Group 13 element that has properties which are borderline between metals and non-metals (semimetallic). It is a semiconductor rather than a metallic conductor. Chemically it is closer to silicon than to aluminium, gallium, indium, and thallium. Crystalline boron is inert chemically and is resistant to attack by boiling HF or HCl. When finely divided it is attacked slowly by hot concentrated nitric acid.

Boron, Symbol: B, Atomic number: 5, Atomic weight: 10.811, solid at 298 K

“Boron has one fewer electron than carbon and as a result can’t form the honeycomb lattice that makes up graphene. For boron to form a single-atom layer, theorists suggested that the atoms must be arranged in a triangular lattice with hexagonal vacancies — holes — in the lattice.”

A new paper shows that borophene is possible – now it just has to be made!

Zachary A. Piazza, Han-Shi Hu, Wei-Li Li, Ya-Fan Zhao, Jun Li, Lai-Sheng Wang, Planar hexagonal B36 as a potential basis for extended single-atom layer boron sheets, Nature Communications, 2014; 5. DOI: 10.1038/ncomms4113

Brown University Press Release:

Unlocking the secrets of the B36 cluster
A 36-atom cluster of boron, left, arranged as a flat disc with a hexagonal hole in the middle, fits the theoretical requirements for making a one-atom-thick boron sheet, right, a theoretical nanomaterial dubbed “borophene.” Credit: Wang lab/Brown University

Graphene, a sheet of carbon one atom thick, may soon have a new nanomaterial partner. In the lab and on supercomputers, chemical engineers have determined that a unique arrangement of 36 boron atoms in a flat disc with a hexagonal hole in the middle may be the preferred building blocks for “borophene.”

Researchers from Brown University have shown experimentally that a boron-based competitor to graphene is a very real possibility.

Lai-Sheng Wang, professor of chemistry at Brown, and his research group, which has studied boron chemistry for many years, have now produced the first experimental evidence that such a structure is possible. In a paper published on January 20 in Nature Communications, Wang and his team showed that a cluster made of 36 boron atoms (B36) forms a symmetrical, one-atom-thick disc with a perfect hexagonal hole in the middle.

“It’s beautiful,” Wang said. “It has exact hexagonal symmetry with the hexagonal hole we were looking for. The hole is of real significance here. It suggests that this theoretical calculation about a boron planar structure might be right.”

It may be possible, Wang said, to use B36 as a basis to form an extended planar boron sheet. In other words, B36 may well be the embryo of a new nanomaterial that Wang and his team have dubbed “borophene.”

“We still only have one unit,” Wang said. “We haven’t made borophene yet, but this work suggests that this structure is more than just a calculation.” ……..

Wang’s experiments showed that the B36 cluster was something special. It had an extremely low electron binding energy compared to other boron clusters. The shape of the cluster’s binding spectrum also suggested that it was a symmetrical structure. ……..

…… That structure also fits the theoretical requirements for making borophene, which is an extremely interesting prospect, Wang said. The boron-boron bond is very strong, nearly as strong as the carbon-carbon bond. So borophene should be very strong. Its electrical properties may be even more interesting. Borophene is predicted to be fully metallic, whereas graphene is a semi-metal. That means borophene might end up being a better conductor than graphene.

“That is,” Wang cautions, “if anyone can make it.”

Abstract: Boron is carbon’s neighbour in the periodic table and has similar valence orbitals. However, boron cannot form graphene-like structures with a honeycomb hexagonal framework because of its electron deficiency. Computational studies suggest that extended boron sheets with partially filled hexagonal holes are stable; however, there has been no experimental evidence for such atom-thin boron nanostructures. Here, we show experimentally and theoretically that B36 is a highly stable quasiplanar boron cluster with a central hexagonal hole, providing the first experimental evidence that single-atom layer boron sheets with hexagonal vacancies are potentially viable. Photoelectron spectroscopy of B36 reveals a relatively simple spectrum, suggesting a symmetric cluster. Global minimum searches for B36 lead to a quasiplanar structure with a central hexagonal hole. Neutral B36 is the smallest boron cluster to have sixfold symmetry and a perfect hexagonal vacancy, and it can be viewed as a potential basis for extended two-dimensional boron sheets.