Dmitry Melkin, Pavel and Boris Lonkin had no doubts about whom to bring onto a team to compete in robot battles. The three had known each other since their student days at Baumanka (Bauman Moscow State Technical University), and later assembled and installed solar power plants together. One day Dmitry saw an advertisement for a robotics competition and applied. His friends backed the idea, and a month later the Solarbot team's first combat robot, Brontosaurus, stood in the garage.
Brontosaurus weighed a full centner (100 kg) and, as its creators now admit, was distinguished by neither reliability nor clever engineering. No wonder: it was built partly on intuition, partly from blurry screenshots of videos of the British Robot Wars competitions.
After Brontosaurus, having recalculated and reworked the main components several times, Dmitry, Boris and Pavel assembled their second robot. For its outward resemblance to a shell it was named Shelby, from the English word "shell". Shelby, the product of hard-won lessons, first beat everyone at "Battle of Robots 2016" in Perm, organized by the Moscow Institute of Technology (MIT) and the Promobot company, and then, together with the machines of two other Russian teams, took part in international competitions in China. Its creators explain how the winning robot works and what it took to build it.
"Our greatest pride is Shelby's chassis. We tinkered with its predecessor's chassis after literally every battle. While building Shelby we recalculated, rebuilt and reassembled the chassis many times, but now we can forget about it entirely. In future projects we will only have to maintain its reliability and increase its power. It would be nice, for example, if our new robot could push not one but two enemy robots at once."
Shelby's chains come from mopeds, its wheels from a racing kart, and its electric motors from radio-controlled model cars. Parts for combat robots are not mass-produced, so you have to hunt for them at flea markets and online. Good parts are very expensive, so builders tend to make them themselves.
"Shelby is a flipper type. It is equipped with a pneumatic system that throws its lid upward with great force. This is the robot's main weapon and also its way of righting itself: if tipped over, it can flip back onto its wheels with a single jerk. But we couldn't create high pressure in the pneumatic cylinder to make the lid fire powerfully - we didn't have the right valves. The only option left was to make the system work as quickly as possible. The solution turned out to be simple: we eliminated excess flow resistance and modified the factory valves. In the future, of course, a high-pressure valve will be needed. A ready-made one is expensive, around 200 thousand rubles, so we are now considering a design of our own."
Combat robots are not a cheap hobby: you need at least 200-300 thousand rubles, plus consumables, spare tires and everything that breaks and gets replaced in battle. And that doesn't count the time and labor spent. "To assemble a robot, a team of three needs to stop going to work for two months," the Solarbot engineers laugh. You can't save on the electronics either.
"The great thing about Shelby's electronics is that there is very little of it. To avoid picking up a soldering iron after every battle, you give the robot only the bare minimum of 'brains'. Shelby has simple factory controllers, and only the valves are controlled by a small custom circuit board. It is very difficult to disable. Even in China, when we were given powerful lithium batteries instead of our usual lead ones and the wiring burned out within a couple of minutes, the robot's electronics were undamaged."
Speed: up to 25 km/h
Force on the pneumatic cylinder rod: 2 t
Engine power: 2.2 kW
Pneumatic strikes without recharging the cylinder: 30-35
Control: remote
The rules of the "Battle of Robots" prohibit any detachable parts and sheet metal, so Shelby does not shoot or swing anything, and its body is made only of metal profile.
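The headline figure of 2 t on the cylinder rod can be sanity-checked with basic pneumatics: rod force equals pressure times piston area, F = p·A. A minimal sketch of that calculation; the 100 mm bore is a hypothetical assumption, not a published Shelby spec:

```python
import math

def required_pressure(force_n: float, bore_m: float) -> float:
    """Pressure (Pa) needed to get a given rod force from a cylinder of a given bore."""
    piston_area = math.pi * (bore_m / 2) ** 2  # piston face area, m^2
    return force_n / piston_area

# 2 t of rod force ~ 2000 kgf ~ 19_620 N; a 100 mm bore is assumed for illustration
pressure_pa = required_pressure(2000 * 9.81, 0.100)
print(f"{pressure_pa / 1e5:.0f} bar")  # tens of bar under these assumptions
```

Even this rough estimate lands in the tens of bar, which is why the team talks about needing a proper high-pressure valve rather than modified factory ones.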
The Solarbot team has built a tough iron soldier, but even it has a breaking point. In China it suffered from the spinning blades of Chinese spinner robots, in Perm from the claws of the robot Matanga, which cuts metal profile like butter with eight tons of force. There are gashes in its iron ribs. Its creators are preparing it for life as an exhibit: it will appear at festivals (the summer Geek Picnic is coming up), while a new fighter takes its place in the arena - also a flipper, only faster, more powerful and even more reliable. The lid's lifting force will be twice Shelby's, motor power will rise from 2.2 to 2.8 kW, and speed will increase. With the new robot, the Russian team dreams of going to Robot Wars in England.
But the future flipper is not Solarbot's ultimate dream. Dmitry is now negotiating with other teams and looking for sponsors: if everything works out, Russia will get its first "megabot" - as big and formidable as the multi-ton Japanese, American and Chinese monsters.
Thanks to the support of the Moscow Institute of Technology, Russians competed for the first time in the international combat-robot tournament FMB Championship 2017 in China. The country was represented by Shelby, the Kazan robot Destructor and the St. Petersburg robot Energy, which reached the semi-finals.
Have you ever wanted to build a fighting robot? You probably thought it was too expensive and dangerous. However, most robot-fighting competitions, including Robot Wars, have a 150-gram weight class. This class is called "Antweight" in most countries and "Fairyweight" in the USA. These robots are much cheaper than large combat robots and not nearly as dangerous, which makes them ideal for beginners. This article will show you how to design and build an Antweight-class combat robot.
NOTE: This article assumes that you have already read about and built a simple radio-controlled robot. If not, go back and do that first. Note also that this article deliberately does not recommend specific parts for your robot; that is meant to encourage creativity and variety among robots.
Understand the rules. Before designing a robot for a competition, you must understand all of its rules. The most important build rules to watch out for are the size/weight requirements (4"x4"x4", 150 grams) and the metal-armor rule, which says you cannot have armor more than 1 mm thick.
What weapon will you use? The weapon is an important part of a combat robot. Come up with a weapon idea, but make sure you stay within the rules. For your first antweight bot, a "flipper" or even a "pusher" is highly recommended. Flipping weapons, if designed correctly, can be the most effective weapons in the Antweight class. Pushing weapons are the simplest, since they do not move: the entire robot acts as the weapon and shoves opponents around. This is effective because the rules state that half of the arena must have no walls, so you can push another robot out of the arena.
Select your parts. Yes, you need to choose your parts before designing. However, don't buy them yet. Simply pick the parts and plan the design around them. If something doesn't fit or doesn't work while you're designing, you'll save money because you can still swap parts. And again: do not buy the parts yet!
Gather specifications. Now that you have selected all the parts, you need to record their dimensions and weights. These should be listed on the website you chose them from. Convert all values in inches to millimeters and write the specifications (in mm) of all your parts on a piece of paper. Then convert the weights (ounces, pounds) to grams and write those down as well.
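The conversions in this step are easy to script instead of running each value through an online converter. A minimal sketch using the standard factors (1 in = 25.4 mm, 1 oz = 28.3495 g, 1 lb = 453.592 g); the part sizes in the example are hypothetical:

```python
IN_TO_MM = 25.4
OZ_TO_G = 28.3495
LB_TO_G = 453.592

def inches_to_mm(inches: float) -> float:
    """Convert a dimension from inches to millimeters."""
    return inches * IN_TO_MM

def weight_to_grams(value: float, unit: str) -> float:
    """Convert a weight in 'oz', 'lb' or 'g' to grams."""
    factors = {"oz": OZ_TO_G, "lb": LB_TO_G, "g": 1.0}
    return value * factors[unit]

# example: a 1.2 in wheel and a 0.8 oz motor (hypothetical part sizes)
print(round(inches_to_mm(1.2), 2))           # 30.48
print(round(weight_to_grams(0.8, "oz"), 2))  # 22.68
```

Keeping every spec in millimeters and grams from the start also makes it trivial to check totals against the 150-gram limit as you go.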
Design. You want the design to be as accurate as possible, which means making a 3D model on a computer rather than a 2D sketch on paper. A 3D design doesn't have to look complicated, though: a simple model of prisms and cylinders will do.
Order your parts. If all your components match your design flawlessly, order the parts. If not, select new parts.
Assemble it. Now you need to build your chassis/armor. Place all the components in the locations specified by your design, connect everything and test it. Try to assemble everything so that components can be removed easily if they need replacing - and they will need replacing more often than on an ordinary robot, since this robot will fight and attacking robots can damage yours. Velcro tape is recommended for holding the parts in place.
Practice driving. It doesn't matter how good your robot is: if you drive out of the arena, you lose. Before you even think about competing, you need to practice driving. Use upside-down cups as cones and drive around them. Use foam blocks as targets and attack them (try this on a small table to practice pushing, and try not to fall off yourself). You could even buy a cheap RC car (on a different frequency from your robot), have another person drive it, and try to push or flip the car without tipping over yourself. If you know someone else with an Antweight robot, have a friendly fight (if possible, replace spinning weapons with less destructive plastic ones).
Compete. Find competitions in your area and have fun destroying other robots! Remember that if you are going to compete in the US, you should look for Fairyweight competitions, not Antweights.
The rules state that the robot must fit into a 4x4x4-inch cube at the start of a match, but it may extend beyond that under remote control afterwards. You can take advantage of this. Suppose your flipping weapon sticks out too far: try designing it so that the flipper points straight up, keeping the robot under four inches tall. Once the sizing cube is removed and the flipper is lowered, the robot's length can exceed four inches.
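A quick way to apply the size/weight rule to your collected specs is a small check like this. The dimensions in the example are hypothetical, and the check assumes the robot's starting (folded) configuration, since that is what goes in the sizing cube:

```python
CUBE_MM = 4 * 25.4    # 101.6 mm per side of the 4-inch sizing cube
MAX_WEIGHT_G = 150.0  # antweight / fairyweight limit

def fits_rules(length_mm: float, width_mm: float,
               height_mm: float, weight_g: float) -> bool:
    """True if the folded robot fits the 4x4x4 inch cube and the 150 g limit."""
    fits_cube = max(length_mm, width_mm, height_mm) <= CUBE_MM
    return fits_cube and weight_g <= MAX_WEIGHT_G

# hypothetical flipper: 95 mm long with the arm raised, 140 g
print(fits_rules(95, 88, 100, 140))  # True
print(fits_rules(120, 88, 60, 140))  # False: too long in start configuration
```

The per-axis comparison is enough here because an axis-aligned box fits the cube exactly when its largest dimension is within the cube's side.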
A large group of scientists, industry leaders and NGOs has launched the Campaign to Stop Killer Robots, dedicated to preventing the development of autonomous combat weapons systems. Signatories include Stephen Hawking, Noam Chomsky, Elon Musk and Steve Wozniak.
These big names are generating a lot of attention and lending legitimacy to the fact that killer robots, once considered science fiction, are in fact fast approaching reality.
An interesting study published in the International Journal of Cultural Research takes a different approach to the idea of "killer robots" as a cultural concept. Researchers argue that even the most advanced robots are just machines, like everything else humanity has ever made.
"The thing is, the 'killer robot' as an idea didn't come out of thin air," said co-author Tero Karppi, an assistant professor of media theory at the University at Buffalo. "It was preceded by methods and technologies that made the conception and development of such systems possible."
In other words, we worry about killer robots because popular culture has taught us to. The authors trace the theme through films such as The Terminator and I, Robot, which imagine that in the distant future robots will end up enslaving the human race.
"Over recent decades, the expanded use of unmanned weapons has dramatically changed warfare, bringing new humanitarian and legal challenges. Rapid advances in technology, driven by efforts to develop fully autonomous weapons, are now under way. These robotic weapons would be able to select and fire on targets on their own, without any human intervention."
The researchers respond that these alarmist dystopian scenarios reflect a “techno-deterministic” worldview, where technological systems are given too much autonomy, which could be destructive not only to society, but to the entire human race.
But what if we coded machine intelligence in such a way that robots couldn't even tell the difference between a human and a machine? It's an intriguing idea: if there is no "us" and "them" there can be no "us versus them."
Indeed, Karppi suggested that we may be able to control how future machines will think about people on a fundamental level.
If we want to change how these systems develop, now is the time. Simply banning lethal autonomous weapons is not enough; the root causes of the dilemma must be addressed if we are truly to avoid the development of autonomous killing machines.
While we were riding the driverless Yandex.Taxi around Skolkovo, military engineers were working out how to adapt driverless-vehicle technology to create new weapons.
In reality, the technology is not quite what it seems. The trouble with all technological evolution is that the line between commercial robots "for everyday life" and military killer robots is incredibly thin, and crossing it costs nothing. Today they choose a driving route; tomorrow they will be able to choose which target to destroy.
This is not the first time in history that technological progress has called the very existence of humanity into question: first scientists created chemical, biological and nuclear weapons, and now "autonomous weapons", that is, robots. The only difference is that until now it was weapons of mass destruction that were considered inhumane - weapons that do not choose whom they kill. Today the perspective has changed: a weapon that kills with discrimination, choosing its victims to its own taste, seems far more immoral. And while a belligerent power might be stopped by the knowledge that using biological weapons would make everyone around suffer, with robots things are more complicated: they can be programmed to destroy a specific group of targets.
In 1942, American writer Isaac Asimov formulated the Three Laws of Robotics, which seemed exciting but completely unrealistic. The laws state that a robot cannot and must not harm or kill a human being, and must unquestioningly obey human will except where an order would contradict that imperative. Now that autonomous weapons have become a reality and may well fall into the hands of terrorists, it turns out that programmers somehow forgot to build Asimov's laws into their software. That means robots can pose a danger, and no humane laws or principles will stop them.
A Pentagon-developed missile detects targets on its own thanks to its software, artificial intelligence (AI) identifies targets for the British military, and Russia is showing off unmanned tanks. Colossal sums are being spent in various countries on developing robotic and autonomous military equipment, though few people want to see it in action. Just as most chemists and biologists have no interest in their discoveries being used to create chemical or biological weapons, most AI researchers have no interest in weapons built on their work, since the resulting public outcry would harm their research programs.
In his speech at the opening of the United Nations General Assembly in New York on 25 September, Secretary-General Antonio Guterres called AI technology a "global risk" alongside climate change and rising income inequality: "Let's call a spade a spade," he said. "The prospect of machines determining who lives and who dies is repugnant." Guterres is perhaps the only one who can urge the military departments to come to their senses: he previously dealt with conflicts in Libya, Yemen and Syria and served as High Commissioner for Refugees.
The problem is that with further development, robots will be able to decide for themselves whom to kill. And if some countries have such technologies and others do not, uncompromising androids and drones will predetermine the outcome of any potential battle. All of this contradicts all of Asimov's laws at once. Alarmists may seriously worry that a self-learning neural network will get out of control and kill not only the enemy but everyone; yet the prospects even for perfectly obedient killing machines are far from bright.
Most active work in artificial intelligence and machine learning today is carried out not in the military but in the civilian sphere - in universities and at companies like Google and Facebook. But most of these technologies can be adapted for military use. That means a potential ban on research in this area would also affect civilian development.
In early October, the American non-governmental organization Campaign to Stop Killer Robots sent a letter to the United Nations demanding that the development of autonomous weapons be restricted at the international legislative level. The UN made clear that it supports the initiative, and in August 2017 it was joined by Elon Musk and participants of the International Joint Conference on Artificial Intelligence (IJCAI). In practice, however, the United States and Russia oppose such restrictions.
The last meeting of the 70 countries party to the Convention on Certain Conventional Weapons (on "inhumane" weapons) took place in Geneva in August. The diplomats were unable to reach consensus on how a global policy on AI might be implemented. Some countries (Argentina, Austria, Brazil, Chile, China, Egypt and Mexico) supported a legislative ban on the development of robotic weapons; France and Germany proposed a voluntary system of restrictions; but Russia, the USA, South Korea and Israel stated that they had no intention of limiting their research and development in this area. In September, Federica Mogherini, the European Union's top official for foreign and security policy, said that such weapons "affect our collective security", and that the decision over life and death must in any case remain in human hands.
Cold War 2018
American defense officials believe that autonomous weapons are necessary for the United States to maintain its military advantage over China and Russia, which are also investing in similar research. In February 2018, Donald Trump demanded $686 billion for the country's defense in the next fiscal year. These costs have always been quite high and decreased only under the previous president, Barack Obama. Trump, unoriginally, justified the increase by technological competition with Russia and China. In 2016, the Pentagon budget allocated $18 billion for the development of autonomous weapons over three years. That is not much, but one very important factor must be taken into account here.
Most AI development in the US is carried out by commercial companies, so the technology is widely available and can be sold to other countries. The Pentagon has no monopoly on advanced machine learning. The American defense industry no longer conducts its own research the way it did during the Cold War, but draws on the work of startups from Silicon Valley, as well as from Europe and Asia. In Russia and China, by contrast, such research is under the strict control of the defense departments, which on the one hand limits the influx of new ideas and the development of the technology, but on the other guarantees state funding and protection.
According to experts cited by The New York Times, military spending on autonomous military vehicles and unmanned aerial vehicles will exceed $120 billion over the next decade. This means the debate ultimately comes down not to whether to create autonomous weapons, but to how much independence to give them.
Fully autonomous weapons do not exist today, but Vice Chairman of the Joint Chiefs of Staff, Air Force General Paul J. Selva, said back in 2016 that within 10 years the United States would have the technology to build weapons capable of deciding on their own whom to kill and when. And while countries debate whether or not to restrict AI, it may already be too late.
Clearpath Robotics was founded six years ago by three college friends who shared a passion for making things. The company's 80 specialists test rough-terrain robots like Husky, a four-wheeled robot used by the US Department of Defense. They also make drones, and have even built a robotic boat called Kingfisher. But there is one thing they will certainly never build: a robot that can kill.
Clearpath is the first and so far only robotics company to pledge never to build killer robots. The decision was made last year by the company's co-founder and CTO, Ryan Garipay, and has in fact attracted experts to the company who liked Clearpath's distinctive ethical stance. The ethics of robotics companies has lately come to the fore. You see, we already have one foot in a future where killer robots exist - and we are not yet ready for them.
There is, of course, still a long way to go. The Korean company Dodam Systems, for example, is building an autonomous robotic turret, the Super aEgis II. It uses thermal-imaging cameras and laser rangefinders to identify and attack targets up to 3 kilometers away. The US is also reportedly experimenting with autonomous missile systems.
Military drones like the Predator are currently piloted by humans, but Garipay says they will become fully automatic and autonomous very soon. And that worries him. A lot. "Lethal autonomous weapons systems could be rolling off the assembly line right now. But lethal weapons systems built in accordance with ethical standards aren't even on the drawing board."
For Garipay, the problem is one of international law. In war there are always situations where the use of force seems necessary but can also endanger innocent bystanders. How do you build killer robots that make the right decision in every situation? How do we even determine for ourselves what the right decision is?
We are already seeing similar problems with autonomous transport. Say a dog runs across the road: should a robotic car swerve to avoid the dog, putting its passengers at risk? What if it's not a dog but a child? Or a bus? Now imagine a war zone.
“We can't agree on how to write a manual for a car like this,” says Garipay. “And now we also want to move to a system that should independently decide whether to use lethal force or not.”
Peter Asaro has spent the last few years lobbying the international community for a ban on killer robots as a founder of the International Committee for Robot Arms Control. He believes the time has come for "a clear international prohibition on their development and use." That, he says, would let companies like Clearpath continue making cool stuff "without worrying that their products could be used to violate people's rights and threaten civilians."
Autonomous missiles are of interest to the military because they solve a tactical problem. When remote-controlled drones operate in combat environments, for example, the enemy often jams the sensors or network connection so that the human operator cannot see what is happening or control the drone.
Garipay says that instead of developing missiles or drones that can independently decide which target to attack, the military should spend money on improving sensors and anti-jamming technology.
"Why don't we take the investment people want to make in building autonomous killer robots and put it into making existing technologies more effective?" he says. "If we set ourselves that challenge and clear that hurdle, we can make this technology work for the benefit of people, not just the military."
Recently, talk of the dangers of artificial intelligence has also grown more frequent. Elon Musk worries that runaway AI could destroy life as we know it; last month he donated $10 million to artificial intelligence research. One of the important questions about AI is how it will merge with robotics. Some, like Baidu researcher Andrew Ng, worry that the coming AI revolution will take people's jobs; others, like Garipay, fear it could take their lives.
Garipay hopes his fellow scientists and machine builders will think about what they are doing. That is why Clearpath Robotics has taken the side of people. "While we as a company can't put $10 million on it, we can put our reputation on it."