Free Alternatives to Gaussian

'Free software' here means free of charge; it does not necessarily mean 'libre software'.

1. GAMESS

Most similar to Gaussian. The code is clean and easy to read, and it even has documentation for developers.

Open-source license: Yes (BSD)

2. PSI

An open-source electronic structure program emphasizing automation, advanced libraries, and interoperability.

Open-source license: Yes (GPLv3)

3. LAMMPS

Has potentials for solid-state materials (metals, semiconductors), soft matter (biomolecules, polymers), and coarse-grained or mesoscopic systems. It can be used to model atoms or, more generically, as a parallel particle simulator at the atomic, meso, or continuum scale.

Open-source license: Yes (GPL)

4. CASTEP

Specializes in periodic systems with plane wave basis sets.

Open-source license: No (commercial)

5. ACES

Specializes in high-level quantum chemistry calculations. It takes the best features of parallel implementations of quantum chemistry methods for electronic structure.

Open-source license: Yes (GPLv3)

6. DIRAC

Oriented toward relativistic quantum chemistry problems.

Open-source license: No (unknown)

7. NWChem

Calculates a smaller set of properties than Gaussian, but it can handle mixed QM/MM calculations and periodic systems such as solids.

Open-source license: Yes (ECL)

2017/11/11 posted in  Material

Advanced Materials Through AI & Computational Materials Science

A recent Nature article examines how materials researchers are using artificial intelligence to complete, in only a few seconds, quantum-mechanical calculations that once took supercomputers hours.

These computer modeling and machine-learning techniques are generating enormous libraries of materials candidates. Researchers hope that this approach will produce a giant leap in the speed and usefulness of materials discovery. British materials scientist Neil Alford observes, “We are now seeing a real convergence of what experimentalists want and what theorists can deliver.”

The most promising results so far have been achieved with lithium compounds, which are used in batteries and other applications.

The Nature article also argues that “artificial intelligence will help researchers comb through vast numbers of materials to find just the one they need for the application at hand.” The standard process starts with researchers applying machine learning to lab data and computer modeling in order to extract common patterns and predict new materials. Researchers then look for a material with specific properties and pass along their findings to chemists, who try to produce the theoretical material for testing.

Personally, I think huge opportunities are available from these types of materials databases — the potential is almost limitless. The advances made so far remind me of the robotic discovery efforts at Dow, the advances made by Bristol-Myers Squibb and other pharmaceutical companies, and recent virus discoveries, such as the ones made by Angie Belcher’s group. These discoveries have resulted in everything from catalysts for oxidative coupling of methane to battery electrode materials. These types of efforts are the physical analog to the computational approaches described in these material databases.

Transforming computer predictions into real-world technologies, however, is difficult. Existing databases include only a small fraction of known materials, and very few of the possible ones. Researchers have also learned that data-driven discovery works well for some materials but not for others. And even when researchers successfully isolate a material with potential, it can take years for chemists to synthesize it in a lab.

Despite these challenges, researchers remain confident that they will discover many useful materials that could lead to innovations in electronics, robotics, healthcare, and other fields. In my opinion, the key for researchers is to avoid the scattershot approach. If scientists can try everything, how do we decide where to focus our efforts? To focus the research there must still be brains behind the computational or robotic synthesis efforts. We need to ensure that we aren’t trying to boil the ocean.

I believe that success will require collaboration between different disciplines and groups. For example, people who understand the computational work may not completely understand the physical impact on materials. We must combine those two areas to provide meaningful information that can be used to impact physical materials. Information inside a computer is only useful if we can translate it to the physical world.

2017/5/19 posted in  Material

Inevitable Comparison: TCP vs UDP

We use Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) to transfer data over the internet.

TCP is the most commonly used protocol because it offers many built-in features, such as connections, error checking, and ordering. Packet delivery is also guaranteed.

UDP is also one of the most widely used protocols. While TCP offers a lot of features, UDP just throws packets: there is no connection, error checking, ordering, etc.

Before talking about use cases, let’s look at their features.

Connection

  • TCP: Connection-oriented (persistent)
  • UDP: Connectionless (fire-and-forget)

Reliability

  • TCP: Reliable (ordered, guaranteed)
  • UDP: Unreliable (drops and reordering are possible)

Weight

  • TCP: Heavy (background mechanisms)
  • UDP: Light (simply throws packets)

Transport

  • TCP: Stream (continuous, ordered)
  • UDP: Datagram (each delivery is independent)

Flow Control

  • TCP: Windowing, congestion avoidance
  • UDP: Nothing

Speed

  • TCP: Slow (resending, recovery, error checking, etc.)
  • UDP: Fast (no extra mechanisms)

We use TCP for important data because it provides a reliable, persistent pipeline. For example, HTTP (web), FTP (file transfer), SMTP (email), and SSH (terminals) are built on top of TCP, and SQL database connections usually run over it as well.
We use UDP for unimportant or transient data because there is no mechanism for reliability or persistence. For example, games, VoIP services, media streaming, and broadcasting are built with UDP.
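
To make the contrast concrete, here is a minimal client-side sketch using Python's standard socket module. The addresses and ports are placeholders: the TCP connect fails loudly unless something is listening there, while the UDP sendto 'succeeds' regardless, which is exactly the point.

    import socket

    # --- UDP: connectionless, fire-and-forget ---
    # No handshake: address each datagram and send it.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"hello", ("127.0.0.1", 9999))  # may be lost, reordered, duplicated
    udp.close()

    # --- TCP: connection-oriented, reliable stream ---
    # connect() performs the three-way handshake before any data moves.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect(("127.0.0.1", 8888))  # raises ConnectionRefusedError if nobody listens
    tcp.sendall(b"hello")             # delivered in order, or an error is raised
    tcp.close()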

Choosing the right protocol depends on your needs. Most developers use TCP because it has pretty much everything built in, and it is as easy to use as file I/O. My suggestion: use TCP for less frequent, more important data; use UDP for more frequent, less important data.

I have tried to show you the basic differences between the TCP and UDP protocols, but there is one more thing to understand (where the magic begins!): they are both built on top of the Internet Protocol (IP). TCP provides a 'connection', but the connection is an illusion! It is established with a three-way handshake. Simply put, TCP is UDP with advanced features: some good developers implemented useful solutions for industry needs. Have you ever wanted to dig into connection establishment and reliability mechanisms? Do you want to implement your own TCP-like protocol?
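
As a taste of what 'TCP is UDP with advanced features' means, here is a toy stop-and-wait sender over UDP: it prefixes each datagram with a sequence number and retransmits until the receiver acknowledges it. The 4-byte framing and the ACK format are my own assumptions for illustration, not any standard; real TCP layers windowing, congestion avoidance, checksums, and connection state on top of the same idea.

    import socket

    def reliable_send(sock, dest, payload, seq, timeout=0.5, retries=5):
        # Toy stop-and-wait: send one datagram tagged with a 4-byte
        # sequence number, then wait for an ACK echoing that number back.
        packet = seq.to_bytes(4, "big") + payload
        sock.settimeout(timeout)
        for _ in range(retries):
            sock.sendto(packet, dest)
            try:
                ack, _ = sock.recvfrom(4)
                if int.from_bytes(ack, "big") == seq:
                    return True        # receiver confirmed this sequence number
            except socket.timeout:
                continue               # lost packet or lost ACK: resend
        return False                   # give up after too many retries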

2017/3/11 posted in  Network

Differences between TLS 1.2 and TLS 1.3

The current version of TLS, TLS 1.2, was defined in RFC 5246 and has been in use for the past eight years by the majority of all web browsers. Companies such as Cloudflare are already making TLS 1.3 available to their customers.

With the release of TLS 1.3, there are promises of enhanced security and speed. But how exactly do the changes from TLS 1.2 to TLS 1.3 cause these improvements? The following is a list of differences between TLS 1.2 and 1.3 that shows how the improvements are achieved.

This protocol was defined in an Internet Draft in April of 2017. TLS 1.3 offers improved security and speed. The major differences include:

• The list of supported symmetric algorithms has been pruned of all legacy algorithms. The remaining algorithms all use Authenticated Encryption with Associated Data (AEAD) algorithms.

• A zero-RTT (0-RTT) mode was added, saving a round-trip at connection setup for some application data at the cost of certain security properties.

• All handshake messages after the ServerHello are now encrypted.

• Key derivation functions have been re-designed, with the HMAC-based Extract-and-Expand Key Derivation Function (HKDF) being used as a primitive (see the sketch after this list).

• The handshake state machine has been restructured to be more consistent and remove superfluous messages.

• ECC is now in the base spec and includes new signature algorithms. Point format negotiation has been removed in favor of a single point format for each curve.

• Compression, custom DHE groups, and DSA have been removed; RSA padding now uses PSS.

• The TLS 1.2 version negotiation verification mechanism was deprecated in favor of a version list in an extension.

• Session resumption with and without server-side state and the PSK-based ciphersuites of earlier versions of TLS have been replaced by a single new PSK exchange.
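
As a taste of the HKDF primitive mentioned in the list above, here is a minimal extract-and-expand sketch using the third-party Python cryptography package. The salt and info values are placeholders for illustration, not the labels TLS 1.3 actually uses.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Extract-and-expand in one step: mix the input keying material with a
    # salt, then expand it into a key of the requested length, bound to `info`.
    hkdf = HKDF(
        algorithm=hashes.SHA256(),
        length=32,                       # derive a 256-bit key
        salt=b"example salt",            # placeholder, not a TLS 1.3 label
        info=b"example context label",   # placeholder, not a TLS 1.3 label
    )
    key = hkdf.derive(b"input keying material")
    print(key.hex())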

In short, the major benefits of TLS 1.3 over TLS 1.2 are faster speeds and improved security.

Speed Benefits of TLS 1.3

TLS and encrypted connections have always added a slight overhead when it comes to web performance. HTTP/2 definitely helped with this problem, but TLS 1.3 speeds up encrypted connections even more. To put it simply, TLS 1.2 needed two round-trips to complete the TLS handshake, while 1.3 requires only one, which cuts the encryption latency in half. This helps encrypted connections feel just a little bit snappier than before.

Another advantage is that, in a sense, on sites you have previously visited you can now send data in the first message to the server. This is called 'zero round-trip time' (0-RTT). And yes, this also results in improved load times.
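
If you are curious which version a given server actually negotiates, Python's standard ssl module will tell you. A minimal sketch; the hostname is just an example, and seeing 'TLSv1.3' requires an OpenSSL build new enough to support it.

    import socket
    import ssl

    context = ssl.create_default_context()   # certificate verification on by default

    # "example.com" is a placeholder; probe any HTTPS host you like.
    with socket.create_connection(("example.com", 443)) as raw:
        with context.wrap_socket(raw, server_hostname="example.com") as tls:
            print(tls.version())              # e.g. 'TLSv1.3' or 'TLSv1.2'
            print(tls.cipher())               # negotiated cipher suite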

Improved Security With TLS 1.3

A big problem with TLS 1.2 is that it is often not configured properly, which leaves websites vulnerable to attacks. TLS 1.3 removes obsolete and insecure features from TLS 1.2, including the following:

  • SHA-1
  • RC4
  • DES
  • 3DES
  • AES-CBC
  • MD5
  • Arbitrary Diffie-Hellman groups — CVE-2016-0701
  • EXPORT-strength ciphers – Responsible for FREAK and LogJam

Because the protocol has been simplified in this sense, administrators and developers are less likely to misconfigure it.
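
On your own clients and servers you can refuse the legacy features outright. Here is a minimal sketch with Python's ssl module, assuming Python 3.7+ for the minimum_version attribute; the cipher string is one reasonable AEAD-only choice, not the only valid one.

    import ssl

    context = ssl.create_default_context()

    # Refuse anything older than TLS 1.2; with a TLS 1.3-capable OpenSSL,
    # connections will prefer 1.3 and its AEAD-only cipher suites.
    context.minimum_version = ssl.TLSVersion.TLSv1_2

    # TLS 1.3 suites are configured separately by OpenSSL, but for TLS 1.2
    # we can still trim the list to AEAD suites explicitly.
    context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")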

2016/2/19 posted in  Network