Weird Acceleration Calculation

I was bored, so I tried calculating acceleration for the first time, for a four-motor tank drive. I'd never done it before, and the result I got seemed way too high.

I used the force I got from
Torque / Radius = Force

and plugged it into
Force / Mass = Acceleration (Newton's 2nd law)

From four 393 motors, I got 58.4 inch-pounds of stall torque. I converted that to newton meters, about 6.60. Then I converted the 2 inch wheel radius into meters, about 0.05 meters. From those, the force came out to about 132 newtons.

From there, I took 13.1 pounds for a robot and converted it to kilograms. That gave me 5.94 kilograms.

Using force and mass, I did 132 / 5.94 and got about 22.2 meters. Converting that to inches gives about 874 inches. Is the result so high because I used force instead of net force?

Metres are not a measure of acceleration.

(132 Newtons) / (5.94 kilograms)

= (132 kilogram metres per second squared) / (5.94 kilograms)

= 22.2 metres per second squared

= 2.3 g
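
Or, spelled out in code with explicit unit conversions (a quick Python sketch of your own numbers; the conversion factors are standard, and the small differences from your figures are just rounding):

```
stall_torque = 58.4 * 0.113    # 58.4 in-lb of stall torque -> about 6.60 N*m
wheel_radius = 2 * 0.0254      # 2 inch wheel radius -> about 0.051 m
mass = 13.1 * 0.4536           # 13.1 lb robot -> about 5.94 kg

force = stall_torque / wheel_radius   # newtons, i.e. kg*m/s^2
accel = force / mass                  # metres per second squared, not metres
print(force, accel, accel / 9.81)     # about 130 N, about 22 m/s^2, about 2.2 g
```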

Yes, this is a lot. But remember that as the motors speed up, their torque output decreases. It falls linearly with robot speed until it just balances the friction the robot experiences, so the robot can't accelerate at 2.3 g forever.

After rereading my message, I could tell how unfocused I was. Sorry you had to read this rat's nest of a post :o

Were you referring to the part where I wrote just "meters"?

If so, I accidentally cut off the "per second squared" part… :o If not, I'm lost on where I messed up the dimensional analysis.

But just to confirm: does that mean I calculated the acceleration for the instant the robot starts driving, and that the acceleration will drop as the robot speeds up? And I'm assuming theoretical top speed can be determined by

circumference of the wheel × free speed of the motor (in revolutions per second) × gear ratio?
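
(For example, with a 4 inch wheel geared 1:1, that would be about 4π inches × 100/60 revolutions per second ≈ 21 inches per second, if I'm using that right?)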

Also, if torque decreases linearly as speed increases, does that mean I can take the torque at its halfway point to determine the average acceleration?

This is a highly non-linear problem, but it can be solved numerically in Excel. I am typing this from my mobile but will try to give you the gist.

Force = mass * acceleration

Force = torque / wheel radius

However, a motor's torque depends on its speed according to the following equation:

Torque = (-stall torque / free speed) * speed + stall torque

Remember that acceleration = change in speed over time

If you use a constant time step then accel = (current speed - previous speed) / delta time

You can solve the 2 equations with 2 unknowns at each time step and build up your accel vs. time curve.
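
To make that concrete, here's a rough sketch of the same stepping scheme, in Python rather than Excel (the numbers are assumptions pulled from earlier in the thread: four 393s at 14.6 in-lb stall each, roughly 100 rpm free speed, a 2 inch wheel radius, a 13.1 lb robot, direct drive, and no friction term, so treat the curve as an upper bound):

```
IN_LB_TO_NM = 0.113                      # 1 inch-pound is roughly 0.113 N*m
stall_torque = 4 * 14.6 * IN_LB_TO_NM    # total stall torque, N*m (~6.6)
free_speed = 100 * 2 * 3.14159 / 60      # 100 rpm in rad/s (~10.5)
wheel_radius = 2 * 0.0254                # 2 inches in metres
mass = 13.1 * 0.4536                     # 13.1 lb in kg (~5.94)

dt = 0.001                               # time step in seconds
v = 0.0                                  # robot speed, m/s
for step in range(251):                  # simulate a quarter of a second
    motor_speed = v / wheel_radius                          # motor speed in rad/s (no slip, 1:1 drive)
    torque = stall_torque * (1 - motor_speed / free_speed)  # the straight-line speed-torque relation
    accel = (torque / wheel_radius) / mass                  # force = torque / r, then a = F / m
    if step % 25 == 0:
        print(f"t={step * dt:5.3f}s  v={v:5.3f} m/s  a={accel:6.2f} m/s^2")
    v += accel * dt                                         # forward Euler step to the next speed
```

The acceleration starts right at the ~22 m/s^2 you calculated and decays toward zero as the robot approaches its top speed; adding a friction term to the force is the obvious next refinement.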

If you have questions I can answer them when I am back at my desk.

Paul

I understand that much, however…

This is where things get vague for me. I've never seen that equation before, but it interests me. By speed, do you mean the velocity of the robot at a specific instant in time?
Also, going by the VEX specs, would the free speed always be 100 rpm and the stall torque always be 14.6 or 8.6 inch-pounds if the motors are commanded at the full +/- 127?
Unfortunately, I don't know much Excel… do you know a link off the top of your head I could use? I can probably search for it myself later today.

Thank you both for your help:)

It's a straight-line equation, and it's shown as the pink line on the motor speed-torque graphs I posted before.

https://vexforum.com/showpost.php?p=310291&postcount=2

When speed is 0, the torque is the stall torque.
When speed is not 0, the torque decreases linearly until, theoretically, it would be 0 at the motor's maximum speed (the free speed).

We show the free speed as 100 rpm on the graphs; it's usually a little higher than that, perhaps 110 rpm.

The one thing missing from these equations is frictional loss.
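
As a rough illustration of where friction would enter (a sketch only, in Python, and the drag force here is a made-up number, not a measurement): the motor's driving force falls linearly with speed, so the robot settles at the speed where that force just equals the drag, a little below the no-friction top speed.

```
stall_torque = 6.60     # N*m, four 393s at stall (the figure from earlier in the thread)
free_speed = 10.5       # rad/s, roughly 100 rpm
wheel_radius = 0.0508   # metres, a 2 inch wheel radius
drag_force = 20.0       # N, a made-up constant drag

# Steady state: (stall_torque / r) * (1 - v / (free_speed * r)) = drag_force
v_top_ideal = free_speed * wheel_radius
v_top_real = v_top_ideal * (1 - drag_force * wheel_radius / stall_torque)
print(v_top_ideal, v_top_real)   # about 0.53 m/s with no friction vs about 0.45 m/s with this drag
```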

I think I'm beginning to have a better grasp of how motors function now… I think I need a few more days to fully absorb this, though… This thread took everything I thought I knew and turned it on its head :D

Before, I used to think that motors could exert around 10 inch-pounds of torque while also reaching high rpm, just not exactly 100. It never really clicked that as motors approach stall torque they spin slower and slower, and as they approach free speed they produce hardly any torque.

But what causes this to happen? An opposing force? Programming?

The short answer is "back-EMF" - the effect that the faster the motor spins, the higher the EMF (voltage) it generates against what the battery/controller supplies; DC motors also work as generators.

Below the free speed, the back-EMF is less than the input voltage to the motor, so the motor accelerates. At the free speed, the back-EMF equals the input voltage, so you get no change in speed (assuming no load or friction). If you physically turned the motor faster than the free speed (maybe with a bigger, faster motor :p) and then let it go, the back-EMF would be greater than the input voltage and the motor would decelerate back to the free speed.
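
If it helps, here is the textbook brushed-DC-motor model that produces this behaviour, as a small Python sketch (the resistance and motor constant are made-up illustration values, not VEX 393 figures):

```
V_supply = 7.2    # volts, roughly a VEX battery (assumed)
R = 1.0           # winding resistance in ohms (made up)
k = 0.5           # motor constant: volts per rad/s, equally N*m per amp (made up)

def torque(speed):                        # speed in rad/s
    back_emf = k * speed                  # grows in proportion to speed
    current = (V_supply - back_emf) / R   # less net voltage leaves less current
    return k * current                    # torque is proportional to current

free_speed = V_supply / k                 # the speed where back-EMF cancels the supply
print(torque(0.0))                        # stall torque = k * V / R
print(torque(free_speed))                 # zero: no net voltage, no current, no torque
print(torque(1.5 * free_speed))           # negative: an over-spun motor brakes itself back down
```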

I was surprised I couldn't readily find good explanations of it, but here is one link; hopefully others can provide more.

Don’t get confused between this and programming - programming a lower input value essentially has the effect of lowering the pink line on jpearman’s graph.

Of course putting more mechanical load on the motor also slows it down - this is a movement along the graph lines as jpearman explained.
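
To connect both of those points back to the simple model above (a sketch only, and it assumes the motor command just scales the effective voltage, which glosses over the motor controller's PWM details): a lower command scales the whole speed-torque line down, while extra mechanical load at a fixed command just moves you along whichever line you are on.

```
R, k = 1.0, 0.5                             # same made-up constants as the earlier sketch

def torque(speed, command=127):             # speed in rad/s, command in the usual -127..127 range
    v_eff = 7.2 * command / 127             # assumption: the command scales the effective voltage linearly
    return k * (v_eff - k * speed) / R      # same back-EMF model as before

print(torque(0, 127), torque(0, 64))        # a lower command lowers the whole line (smaller stall torque)
print(torque(5.0, 127), torque(10.0, 127))  # more load means less speed: moving along the full-power line
```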