Tag Archives: Harris & Harris Group

Google announces research results after testing 1,097-qubit D-Wave 2X™ quantum computers

If you’ve been reading this blog over the last few months, you’ll know that I’ve mentioned D-Wave Systems, a Vancouver (Canada)-based quantum computing company, frequently. The company seems to be signing all kinds of deals lately including one with Google (my Oct. 5, 2015 posting). Well, a Dec. 9, 2015 news item on Nanotechnology Now sheds more light on how Google is using D-Wave’s quantum computers,

Harris & Harris Group, Inc. (NASDAQ: TINY), an investor in transformative companies enabled by disruptive science, notes that yesterday [Dec. 8, 2015] NASA, Google and the Universities Space Research Association (USRA) hosted a tour of the jointly run Quantum Artificial Intelligence Laboratory, located at NASA’s Ames Research Center, which houses one of D-Wave’s 1,097-qubit D-Wave 2X™ quantum computers. At this event, Google announced that D-Wave’s quantum computer was able to find solutions to complicated problems of nearly 1,000 variables up to 10^8 (100,000,000) times faster than classical computers.

A Dec. 8, 2015 posting by Hartmut Neven for the Google Research blog describes the research and the results (Note: Links have been removed),

During the last two years, the Google Quantum AI [artificial intelligence] team has made progress in understanding the physics governing quantum annealers. We recently applied these new insights to construct proof-of-principle optimization problems and programmed these into the D-Wave 2X quantum annealer that Google operates jointly with NASA. The problems were designed to demonstrate that quantum annealing can offer runtime advantages for hard optimization problems characterized by rugged energy landscapes. We found that for problem instances involving nearly 1000 binary variables, quantum annealing significantly outperforms its classical counterpart, simulated annealing. It is more than 10^8 times faster than simulated annealing running on a single core. We also compared the quantum hardware to another algorithm called Quantum Monte Carlo. This is a method designed to emulate the behavior of quantum systems, but it runs on conventional processors. While the scaling with size between these two methods is comparable, they are again separated by a large factor sometimes as high as 10^8.

For anyone (like me) who needs an explanation of quantum annealing, there’s this from its Wikipedia entry (Note: Links have been removed),

Quantum annealing (QA) is a metaheuristic for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states), by a process using quantum fluctuations. Quantum annealing is used mainly for problems where the search space is discrete (combinatorial optimization problems) with many local minima; such as finding the ground state of a spin glass.[1] It was formulated in its present form by T. Kadowaki and H. Nishimori in “Quantum annealing in the transverse Ising model”[2] though a proposal in a different form had been proposed by A. B. Finnila, M. A. Gomez, C. Sebenik and J. D. Doll, in “Quantum annealing: A new method for minimizing multidimensional functions”.[3]

Not as helpful as I’d hoped but sometimes it’s necessary to learn a new vocabulary and a new set of basic principles, which takes time and requires the ability to ‘not know’ and/or ‘not understand’ until one day, you do.
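For anyone who’d like to see the classical side of Google’s comparison, here’s a minimal sketch of simulated annealing, the algorithm the D-Wave 2X was benchmarked against. It’s my own toy illustration in Python, with a made-up one-dimensional ‘rugged energy landscape’ and invented parameters, nothing like the 1,000-variable problems in the actual study,

```python
# A toy simulated annealing run on a one-dimensional "rugged energy landscape."
# Illustrative only: the objective and parameters are made up, and Google's
# benchmarks used ~1,000-variable Ising problems, not a single real variable.
import math
import random

def energy(x):
    # A rugged objective with many local minima; global minimum near x = -0.5.
    return x * x + 10 * math.sin(3 * x)

def simulated_annealing(steps=20000, temp_start=10.0, temp_end=0.01):
    x = random.uniform(-10, 10)           # random starting candidate
    best_x, best_e = x, energy(x)
    for step in range(steps):
        # Cool the temperature geometrically from temp_start to temp_end.
        temp = temp_start * (temp_end / temp_start) ** (step / steps)
        x_new = x + random.gauss(0, 0.5)  # propose a nearby candidate
        delta = energy(x_new) - energy(x)
        # Always accept downhill moves; accept uphill moves with Boltzmann
        # probability exp(-delta / temp) so the search can escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = x_new
            if energy(x) < best_e:
                best_x, best_e = x, energy(x)
    return best_x, best_e

x, e = simulated_annealing()
print(f"best x = {x:.3f}, energy = {e:.3f}")
```

Quantum annealing replaces those random thermal hops with quantum tunneling, which can pass through tall, thin energy barriers rather than climbing over them; that tunneling is the source of the runtime advantage Google reports.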

In the meantime, here’s more possibly befuddling information from the researchers in the form of a paper on arXiv.org,

What is the Computational Value of Finite Range Tunneling? by Vasil S. Denchev, Sergio Boixo, Sergei V. Isakov, Nan Ding, Ryan Babbush, Vadim Smelyanskiy, John Martinis, Hartmut Neven. http://arxiv.org/abs/1512.02206

This paper is open access.

Lockheed Martin upgrades to 1000+ Qubit D-Wave system

D-Wave Systems, a Canadian quantum computing company, seems to be making new business announcements on a weekly basis. After last week’s US Los Alamos National Laboratory announcement (Nov. 12, 2015 posting), there’s a Nov. 16, 2015 news item on Nanotechnology Now,

Harris & Harris Group, Inc. (NASDAQ:TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has entered into a multi-year agreement with Lockheed Martin to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits.

A Nov. 16, 2015 D-Wave Systems news release provides more details about the deal,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has entered into a multi-year agreement with Lockheed Martin (NYSE: LMT) to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits. This represents the second system upgrade since Lockheed Martin became D-Wave’s first customer in 2011 with the purchase of a 128 qubit D-Wave One™ system. The agreement includes the system, maintenance and associated professional services.

“Our mission is to solve complex challenges, advance scientific discovery and deliver innovative solutions to our customers, which requires expertise in the most advanced technologies,” said Greg Tallant, Lockheed Martin fellow and lead for the University of Southern California-Lockheed Martin Quantum Computation Center (QCC). “Through our continued investment in D-Wave technology, we are able to push the boundaries of quantum computing and apply the latest technologies to address the real-world problems being faced by our customers.”

For quantum computing, the performance gain over traditional computing is most evident in exceedingly complex computational problems. This could be in areas such as validating the performance of software or vehicle planning and scheduling. With the new D-Wave system, Lockheed Martin researchers will be able to explore solutions for significantly larger computational problems with improved accuracy and execution time.

The new system will be hosted at the University of Southern California-Lockheed Martin Quantum Computation Center, which first began exploring the power of quantum computing with the D-Wave One, the world’s first quantum computer.

The installation of the D-Wave 2X system will be completed in January 2016.

Who knows what next week will bring for D-Wave, which, by the way, is located in Vancouver, Canada or, more accurately, Burnaby?

D-Wave passes 1000-qubit barrier

A local (Vancouver, Canada-based) quantum computing company, D-Wave is making quite a splash lately due to a technical breakthrough. (h/t to Speaking up for Canadian Science for the Business in Vancouver article, and to Nanotechnology Now for the Harris & Harris Group press release and the Economist article.)

A June 22, 2015 article by Tyler Orton for Business in Vancouver describes D-Wave’s latest technical breakthrough,

“This updated processor will allow significantly more complex computational problems to be solved than ever before,” Jeremy Hilton, D-Wave’s vice-president of processor development, wrote in a June 22 [2015] blog entry.

Regular computers use two bits – ones and zeroes – to make calculations, while quantum computers rely on qubits.

Qubits possess a “superposition” that allows them to be one and zero at the same time, meaning they can calculate all possible values in a single operation.

But the algorithm for a full-scale quantum computer requires 8,000 qubits.
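That ‘all possible values in a single operation’ line is the standard shorthand, though it compresses a lot: a measurement still returns just one outcome. What is true is that the state of n qubits is described by 2^n complex amplitudes, which is why simulating qubits on classical machines gets expensive so quickly. Here’s a quick sketch of my own (illustrative only, not D-Wave code),

```python
# Illustrative only: the state of n qubits is a vector of 2**n complex
# amplitudes. Tracking all of them is what makes classical simulation of
# quantum hardware expensive as n grows.
import itertools

n = 3
dim = 2 ** n                    # size of the state vector: 8 amplitudes here
amplitude = 1 / dim ** 0.5      # uniform superposition over all basis states

state = {bits: amplitude for bits in itertools.product((0, 1), repeat=n)}

for bits, amp in state.items():
    # Measurement probabilities are |amplitude|**2 and sum to 1.
    print(bits, f"probability = {abs(amp) ** 2:.3f}")
```

At n = 3 that’s eight amplitudes; at 1,000 qubits it’s the 2^1000 figure quoted in the press release below.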

A June 23, 2015 Harris & Harris Group press release adds more information about the breakthrough,

Harris & Harris Group, Inc. (Nasdaq: TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has successfully fabricated 1,000 qubit processors that power its quantum computers. D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1,000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which is substantially larger than the 2^512 possibilities available to the company’s currently available 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
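If, like me, you enjoy checking such claims: the observable universe is commonly estimated to hold on the order of 10^80 particles (my figure, not the press release’s), and Python’s arbitrary-precision integers make the comparison trivial,

```python
# Quick check of the press release's arithmetic. The 10**80 figure for
# particles in the observable universe is a common estimate, not a number
# taken from the release itself.
print(len(str(2 ** 512)))     # 155 digits: the D-Wave Two's search space
print(len(str(2 ** 1000)))    # 302 digits: the new 1,000-qubit search space
print(2 ** 1000 > 10 ** 80)   # True -- and not by a little
```

So the “more possibilities than there are particles in the observable universe” line holds up, by more than 200 orders of magnitude.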

A June 22, 2015 D-Wave news release, which originated the technical details about the breakthrough found in the Harris & Harris press release, provides more information along with some marketing hyperbole (Note: Links have been removed),

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the increased precision coupled with the noise reduction has demonstrated improved precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication: The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

A June 20, 2015 article in The Economist notes the growing commercial interest in quantum computing and provides good introductory information about the field. The article includes an analysis of various research efforts in Canada (they mention D-Wave), the US, and the UK. These excerpts don’t do justice to the article but will hopefully whet your appetite or provide an overview for anyone with limited time,

A COMPUTER proceeds one step at a time. At any particular moment, each of its bits—the binary digits it adds and subtracts to arrive at its conclusions—has a single, definite value: zero or one. At that moment the machine is in just one state, a particular mixture of zeros and ones. It can therefore perform only one calculation next. This puts a limit on its power. To increase that power, you have to make it work faster.

But bits do not exist in the abstract. Each depends for its reality on the physical state of part of the computer’s processor or memory. And physical states, at the quantum level, are not as clear-cut as classical physics pretends. That leaves engineers a bit of wriggle room. By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.

… The biggest question is what the qubits themselves should be made from.

A qubit needs a physical system with two opposite quantum states, such as the direction of spin of an electron orbiting an atomic nucleus. Several things which can do the job exist, and each has its fans. Some suggest nitrogen atoms trapped in the crystal lattices of diamonds. Calcium ions held in the grip of magnetic fields are another favourite. So are the photons of which light is composed (in this case the qubit would be stored in the plane of polarisation). And quasiparticles, which are vibrations in matter that behave like real subatomic particles, also have a following.

The leading candidate at the moment, though, is to use a superconductor in which the qubit is either the direction of a circulating current, or the presence or absence of an electric charge. Both Google and IBM are banking on this approach. It has the advantage that superconducting qubits can be arranged on semiconductor chips of the sort used in existing computers. That, the two firms think, should make them easier to commercialise.

Google is also collaborating with D-Wave of Vancouver, Canada, which sells what it calls quantum annealers. The field’s practitioners took much convincing that these devices really do exploit the quantum advantage, and in any case they are limited to a narrower set of problems—such as searching for images similar to a reference image. But such searches are just the type of application of interest to Google. In 2013, in collaboration with NASA and USRA, a research consortium, the firm bought a D-Wave machine in order to put it through its paces. Hartmut Neven, director of engineering at Google Research, is guarded about what his team has found, but he believes D-Wave’s approach is best suited to calculations involving fewer qubits, while Dr Martinis and his colleagues build devices with more.

It’s not clear to me if the writers at The Economist were aware of D-Wave’s latest breakthrough at the time of writing but I think not. In any event, they (The Economist writers) have included a provocative tidbit about quantum encryption,

Documents released by Edward Snowden, a whistleblower, revealed that the Penetrating Hard Targets programme of America’s National Security Agency was actively researching “if, and how, a cryptologically useful quantum computer can be built”. In May IARPA [Intelligence Advanced Research Projects Activity], the American government’s intelligence-research arm, issued a call for partners in its Logical Qubits programme, to make robust, error-free qubits. In April, meanwhile, Tanja Lange and Daniel Bernstein of Eindhoven University of Technology, in the Netherlands, announced PQCRYPTO, a programme to advance and standardise “post-quantum cryptography”. They are concerned that encrypted communications captured now could be subjected to quantum cracking in the future. That means strong pre-emptive encryption is needed immediately.

I encourage you to read the Economist article.

Two final comments. (1) The latest piece, prior to this one, about D-Wave was in a Feb. 6, 2015 posting about then-new investment in the company. (2) A Canadian effort in the field of quantum cryptography was mentioned in a May 11, 2015 posting (scroll down about 50% of the way), which features a profile of Raymond Laflamme of the University of Waterloo’s Institute for Quantum Computing, in the context of an announcement about the science media initiative Research2Reality.

More investment money for Canada’s D-Wave Systems (quantum computing)

A Feb. 2, 2015 news item on Nanotechnology Now features D-Wave Systems (located in the Vancouver region, Canada) and its recent funding bonanza of $29 million (CAD),

Harris & Harris Group, Inc. (Nasdaq:TINY), an investor in transformative companies enabled by disruptive science, notes the announcement by portfolio company, D-Wave Systems, Inc., that it has closed $29 million (CAD) in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million (CAD) raised in 2014. Harris & Harris Group’s total investment in D-Wave is approximately $5.8 million (USD). D-Wave’s announcement also includes highlights of 2014, a year of strong growth and advancement for D-Wave.

A Jan. 29, 2015 D-Wave news release provides more details about the new investment and D-Wave’s 2014 triumphs,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has closed $29 million in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million raised in 2014.

“The investment is a testament to the progress D-Wave continues to make as the leader in quantum computing systems,” said Vern Brownell, CEO of D-Wave. “The funding we received in 2014 will advance our quantum hardware and software development, as well as our work on leading edge applications of our systems. By making quantum computing available to more organizations, we’re driving our goal of finding solutions to the most complex optimization and machine learning applications in national defense, computing, research and finance.”

The funding follows a year of strong growth and advancement for D-Wave. Highlights include:

•    Significant progress made towards the release of the next D-Wave quantum system featuring a 1000 qubit processor, which is currently undergoing testing in D-Wave’s labs.
•    The company’s patent portfolio grew to over 150 issued patents worldwide, with 11 new U.S. patents being granted in 2014, covering aspects of D-Wave’s processor technology, systems and techniques for solving computational problems using D-Wave’s technology.
•    D-Wave Professional Services launched, providing quantum computing experts to collaborate directly with customers, and deliver training classes on the usage and programming of the D-Wave system to a number of national laboratories, businesses and universities.
•    Partnerships were established with DNA-SEQ and 1QBit, companies that are developing quantum software applications in the spheres of medicine and finance, respectively.
•    Research throughout the year continued to validate D-Wave’s work, including a study showing further evidence of quantum entanglement by D-Wave and USC [University of Southern California] scientists, published in Physical Review X this past May.

Since 2011, some of the most prestigious organizations in the world, including Lockheed Martin, NASA, Google, USC and the Universities Space Research Association (USRA), have partnered with D-Wave to use their quantum computing systems. In 2015, these partners will continue to work with the D-Wave computer, conducting pioneering research in machine learning, optimization, and space exploration.

D-Wave, which already employs over 120 people, plans to expand hiring with the additional funding. Key areas of growth include research, processor and systems development and software engineering.

Harris & Harris Group offers a description of D-Wave which mentions nanotechnology and hosts a couple of explanatory videos,

D-Wave Systems develops an adiabatic quantum computer (QC).

Status
Privately Held

The Market
Electronics – High Performance Computing

The Problem
Traditional or “classical computers” are constrained by the sequential character of data processing that makes the solving of non-polynomial (NP)-hard problems difficult or potentially impossible in reasonable timeframes. These types of computationally intense problems are commonly observed in software verifications, scheduling and logistics planning, integer programming, bioinformatics and financial portfolio optimization.

D-Wave’s Solution
D-Wave develops quantum computers that are capable of processing data using the quantum mechanical properties of matter. This leverage of quantum mechanics enables the identification of solutions to some non-polynomial (NP)-hard problems in a reasonable timeframe, instead of the exponential time needed for any classical digital computer. D-Wave sold and installed its first quantum computing system to a commercial customer in 2011.
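To make “energy landscape” concrete: D-Wave’s machines are built around minimizing quadratic functions of binary variables (Ising/QUBO problems). Here’s a toy brute-force version of my own devising; the coefficients are invented for illustration and this is not D-Wave’s programming interface,

```python
# A toy brute-force search over a QUBO (quadratic unconstrained binary
# optimization) "energy landscape" -- the class of problems quantum annealers
# target. The coefficients in Q are invented; this is not D-Wave's API.
import itertools

Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,   # linear terms (diagonal)
     (0, 1): 2.0, (1, 2): -3.0}                 # pairwise couplings

def qubo_energy(x, Q):
    # E(x) = sum of Q[i, j] * x[i] * x[j] over all entries, with x[i] in {0, 1}
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

n = 3
best = min(itertools.product((0, 1), repeat=n), key=lambda x: qubo_energy(x, Q))
print("lowest-energy assignment:", best, "with energy", qubo_energy(best, Q))
```

Brute force enumerates all 2^n assignments, which stops being feasible somewhere in the tens of variables; that exponential wall is why heuristics, whether classical or quantum annealing, are interesting at all.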

Nanotechnology Factor
To function properly, the D-Wave processor requires tight control and manipulation of quantum mechanical phenomena. This control and manipulation is achieved by creating integrated circuits based on Josephson junctions and other superconducting circuitry. By picking superconductors, D-Wave managed to combine quantum mechanical behavior with the macroscopic dimensions needed for high-yield design and manufacturing.

It seems D-Wave has made some research and funding strides since I last wrote about the company in a Jan. 19, 2012 posting, although there is no mention of quantum computer sales.

Over 2000 nanotechnology businesses?

Nanowerk has announced a new, free feature: their Nanotechnology Company Directory. From the July 1, 2010 news item,

At the latest count, over 2100 companies from 48 countries are involved in nanotechnology research, manufacturing or applications – a number that keeps growing at a considerable pace.

With more than 1100 companies, the U.S. is home to roughly half of all nanotechnology firms. 670 companies are in Europe, 230 in Asia and 210 elsewhere in the world. Within Europe, Germany is represented with 211 companies, followed by the U.K. with 146 companies.

Over 270 companies are involved in the manufacture of raw materials such as nanoparticles, nanofibers and -wires, carbon nanotubes, or quantum dots. More than 340 companies are active in life sciences and pharmaceutical fields. The vast majority – well over half of all companies – are involved in manufacturing instruments, devices, or advanced materials and components.

The news item goes on to provide a definition for what constitutes a nanotechnology company which is timely in light of Dexter Johnson’s June 30, 2010 posting (What Is a Nanotechnology Company Anyway?) at Nanoclast,

I stopped for a moment after reading [in an investment notice he’d received] this term “nanotechnology company” to consider what might actually constitute such a thing. Is Toyota a nanotechnology company as some nanotechnology stock indices have claimed? Is IBM a nanotechnology company because they are doing research into using graphene and carbon nanotubes in electronics? How about all the instrumentation and microscopy companies that give us the tools to see and to work on the nanometer and angstrom scale, are they nanotechnology companies? What about the flood of nanomaterials companies that started making carbon nanotubes in their basements that were going to revolutionize industry?

Despite figures ranging from one to three trillion dollars being dangled in front of people’s faces for the last 10 years, it doesn’t seem to have attracted the level of investment that would really make a difference in advancing the commercial aspirations of nanotechnologies if the recent PCAST meeting is any indication.

So the definition has an impact since entrepreneurs need to attract investment and, as more than one of the participants in the recent PCAST meeting noted, moving the discoveries from the laboratory to the market place is a laborious process where there is a significant dearth of investment interest for a phase described as the ‘valley of death’ or, as one participant termed it, the ‘lab gap’. (My post about that particular PCAST meeting, ‘The Golden Triangle workshop’, is here.)

The same day Nanowerk announced its new nanotechnology company directory, Christine Peterson at the Foresight Institute posted an item about a venture capital group known for investing in nanotech and microsystems,

Small investors who want to invest in nanotech startups have for years turned to publicly-held venture group Harris & Harris Group, which has focused on private companies in nanotech and microsystems.

With the economy down, and initial public offerings (IPOs) more rare, this strategy is changing.

Peterson is commenting on a Wall Street Journal blog posting by Brian Gormley,

In a June 28 letter to shareholders, Chief Executive [of Harris & Harris Group] Douglas Jamison said many of its private holdings are maturing nicely. Even so, volatility and risk aversion in the public markets are making it difficult for these companies [nanotech and microsystems] to go public.

Although the firm plans to continue investing in private companies, “We currently do not plan to make an initial equity investment in a private company until we get increased visibility into the timing of liquidity for our privately held portfolio,” Jamison wrote in the letter.

The firm, which has 31 private investments in its portfolio, expects to gain such visibility later this year. Jamison was not available for comment Monday.

“With the lengthening time between investment and return on investment in private venture capital-backed companies, we need to find a way to generate returns with greater frequency,” Jamison said in the letter.

“As a public company, we should not count on investors to wait five years between liquidity events. We will seek to position our investments so that we can demonstrate positive returns on investments on an annual basis.”

The valley of death or lab gap seems to be getting wider while venture capitalists who do know the industry pull back. Meanwhile, a standard investor is likely to experience confusion about what the term nanotechnology company means and just how much that ‘market’ is liable to be worth.

Interview with Dr. David T. Cramb; venture capital and nano and microsystems; NanoBusiness Alliance roundtable; science and artists

March 3, 2010, I posted about Dr. David Cramb, director of the Nanoscience Program and professor in the department of Chemistry at the University of Calgary, and his colleagues. They had just published a paper (Measuring properties of nanoparticles in embryonic blood vessels: Towards a physicochemical basis for nanotoxicity) in Chemical Physics Letters about a new methodology they are developing to measure the impact of nanoparticles on human health and the environment. Dr. Cramb very kindly answered some email questions about the study (abstract is here, article is behind a paywall).

  • Is this work on nanoparticles and blood vessels part of a larger project? i.e. Is this an OECD project; is there going to be an international report; is this part of a cross-Canada investigation into nanoparticles and their impact on health?

This is a collaborative project, but the reports that we generate will be available to Environment Canada and Health Canada. We have collaborators from both agencies.

  • In reading the abstract (for the article, which is behind a paywall and probably too technical for me), it seemed to me that this is a preliminary study which sets the stage for a nanoparticle study. In fact, you were studying quantum dots (CdSe/ZnS) and establishing that a particular kind of spectroscopy could be used to track the accumulation of nanoparticles in chicken embryos. Is this correct? And if so, why not study the nanoparticles directly?

A quantum dot is a type of nanoparticle.  So, in principle, we can apply our techniques to any other nanoparticle of interest.

  • What does CdSe/ZnS stand for?

cadmium selenide (in the centre of the nanoparticle) / zinc sulfide (coating on the outside)

  • What kind or kinds of nanoparticles are going to be used for the study moving forward from this one?

Similar but different sizes and surface chemistries. We want to understand what properties affect uptake into tissues and distribution in organs. That way we can predict risk.

  • From reading the abstract (and thanks to the person who wrote the explanation), I have a pretty good idea why chicken embryos are being used. [I’ll insert the description from the abstract here with attributions.] In another context, I have come across the notion that chickens in the US at least, I don’t know about Canada, have been so thoroughly compromised genetically that using their embryos for research is problematic. (brief note: I attended a lecture by Susan Squier, a noted academic, who had a respondent [a US scientist] claiming he moved to the UK because he didn’t feel confident experimenting with US chicken embryos.) What are your thoughts on this?

We aren’t doing genetic studies, so knowing the lineage of the embryos isn’t critical for us.

  • Is there anything else you’d like to add?

Nanoparticles are being used in many areas from cosmetics to pharmaceuticals to energy. As yet, there is no evidence that the nanoscale formulation adds any risk to these applications. We in nanoscience believe that we must maintain due diligence to assess future risk and to make nanotechnology as green as possible.

Thank you Dr. Cramb for taking the time to explain your work.

On a completely different front, Harris & Harris Group, a venture capital group that invests in nanotechnology and microsystems, is holding a fourth quarter conference call on Friday, March 12, 2010. From the Harris & Harris Group website,

With over 30 nanotechnology companies in our portfolio, Harris & Harris Group, Inc., is one of the most active nanotechnology investors in the world. We have funded companies developing nanoscale-enabled solutions in solid state lighting, emerging memory devices, printable electronics, photovoltaics, battery technologies, thermal and power management, next-generation semiconductor devices and equipment, quantum computing, as well as in various life-science applications of nano-structured materials.

We consider a company to fit our investment thesis if the company employs, intends to employ or enables technology that we consider to be at the microscale, nanoscale or smaller and if the employment of that technology is material to its business plan. We are interested in funding entrepreneurs with energy, vision and the desire to build great companies.

From the news release on CNN announcing the conference call,

The management of Harris & Harris Group, Inc. (Nasdaq:TINY) will hold a conference call to discuss the Company’s financial results for its fiscal fourth quarter and full year 2009, to update shareholders and analysts on our business and to answer questions, on Friday, March 19, 2010, at 10:00 a.m. Eastern Time.

For details about accessing the webcast, please follow the link to the news release.

Still on business-related nanotechnology news, the NanoBusiness Alliance will be holding its annual Washington, DC roundtable, March 15-17, 2010. From the news item on Nanowerk,

The NanoBusiness Alliance, the world’s leading nanotechnology trade association, today announced that it will convene numerous nanotechnology industry executives in Washington, D.C. from March 15 – 17 for its 9th annual “Washington DC Roundtable”. As in past years, NanoBusiness Alliance members will participate in three days of high-level meetings with Members of Congress, Administration officials, and key staff.

If you are interested in the NanoBusiness Alliance, their homepage is here.

For today’s almost-final entry, I’m going back to science and its relationship to art, a topic alluded to just prior to my introduction of the interview with Cheryl Geisler (dean of the Faculty of Communication, Art and Technology at Simon Fraser University, Canada). At the time, I noted that art, science and technology are interconnected to justify my inclusion of art topics in this blog and, specifically, my inclusion of the Geisler interview. I just read an entry by David Bruggeman (Pasco Phronesis blog) which describes the impact that art can have. From the post,

… McCall’s art is certainly an influence on why I’m involved with science and technology today. You may not know it, but it’s likely you’ve seen his work in connection with reports on space, or in works of science fiction for the page or the screen …

McCall is Robert McCall, an important space artist who recently died. His website is here and Bruggeman provides other links to McCall’s works.

This bit has nothing to do with anything other than I’ve always thought Emma Peel was Steed’s (The Avengers) best partner and found this tribute (clips of Diana Rigg as Peel set to The Kinks) on Raincoaster here. (Scroll down the page.)