Did you know that the first electric cars appeared in the mid-19th century? The earliest electric vehicles ran on single-use batteries, so the "charging technology" of the era was simply swapping a depleted battery for a fresh one. Even so, electric vehicles never became particularly popular at the time, largely because electricity itself was scarce.
The electrification of homes in the early 20th century made this invention far more accessible to the general public. At that time, 38% of all cars in the United States were electric. These cars were recharged either by plugging the vehicle in directly or by removing the battery, charging it elsewhere, and then reinstalling it.
In those early days, General Electric introduced the first charging stations, known as "Electrants." Resembling telephone booths, they were installed around major U.S. cities, where EV owners could use them to recharge their vehicles.
As the automotive industry matured in the 1920s and roads improved in both quality and reach, electric vehicles, with their limited range, became impractical for longer journeys. Falling gasoline prices during this period further fueled the rise of internal combustion engine vehicles.
By the late 20th century, growing public awareness of air pollution renewed interest in electric vehicles. Car manufacturers began developing new electric models, which in turn drew attention to the need for proper charging infrastructure.
The earliest of these modern EVs could be charged at home from a standard household outlet. But as manufacturers began producing plug-in electric vehicles at scale, demand grew for a robust public charging network, bringing us into the 21st century and the ongoing evolution of electric vehicles and charging technologies.