To answer the OP's question, yes, that is how you would measure the current being drawn from the battery. Your meter needs to be set to measure DC amps, the leads should be plugged into the appropriate ports on the meter, and the meter should be rated higher than the current you expect to read. Most decent meters will measure up to 10 A; if you expect more than that, you'll blow the meter's fuse (as will connecting the leads incorrectly). Current can be measured in series between either the earth or live lead and the corresponding battery terminal, whichever is easier, and a negative reading just means the current flow is in the opposite direction to conventional flow, red to black (you've got the leads the wrong way round, but it doesn't matter).
Converting power (watts) to current (amps) using Ohm's law is all well and good in the classroom (or for sizing fuses), but it assumes that the stated power is accurate and that the voltage is fixed, and we know neither will be true in practice.
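To illustrate how much that assumption matters, here's a minimal sketch of the conversion at a few different supply voltages. The 55 W load and the voltages are just illustrative numbers I've picked, not figures from the question.

```python
# Sketch: the same rated power maps to noticeably different currents
# depending on what the supply voltage actually is at the time.

def amps_from_watts(power_w: float, voltage_v: float) -> float:
    """Current drawn by a load of a given power at a given supply voltage."""
    return power_w / voltage_v

for v in (12.0, 12.6, 13.8, 14.4):
    print(f"55 W load at {v:4.1f} V -> {amps_from_watts(55, v):.2f} A")
```

Between a resting battery and a running alternator the answer already shifts by the better part of an amp, which is why the measured figure rarely matches the back-of-envelope one.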
Nominal battery voltage is 12.6 V, and 12.4 V would be fairly normal, but a battery could quite easily sit at over 13 V coming off charge due to surface charge and still not even click when you try to crank the engine over. If the measured voltage drops much below 10 V then something is wrong (it could just be partially discharged). This is hugely temperature dependent, with cold reducing chemical activity and so on. Relays stop operating at about 7.5 V, so if the battery drops that low during cranking, it won't start no matter how long you turn it over.
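If it helps, here's a rough sketch that turns those thresholds into a quick interpretation of a meter reading; the band boundaries simply follow the figures above, and the temperature caveat still applies.

```python
# Rough interpretation of battery voltage readings, using the
# thresholds mentioned above (resting and cranking cases).

def battery_state(resting_v: float) -> str:
    """Very rough read on a resting (engine-off) battery voltage."""
    if resting_v > 13.0:
        return "likely surface charge - retest after a brief load"
    if resting_v >= 12.4:
        return "normal resting voltage"
    if resting_v >= 10.0:
        return "partially discharged"
    return "well below 10 V - discharged or faulty"

def cranking_ok(cranking_v: float) -> bool:
    """Relays drop out around 7.5 V, so below that it won't start."""
    return cranking_v > 7.5

print(battery_state(12.7))   # normal resting voltage
print(cranking_ok(7.0))      # False - no start however long you crank
```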
Remember also that the amp hour rating of a battery is actually the 20-hour rating, i.e. the notional load that would discharge it to 10.5 V (flat) over 20 hours. For example, a 40 Ah battery would sustain a 2 A load for 20 hours, but if you then placed a 40 A load on the same battery, it would not last an hour! As the load increases, the effective Ah capacity decreases; you'd probably only be looking at a little over 30 minutes at a 40 A discharge, along with a lot of waste heat.
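That drop-off is usually modelled with Peukert's law. Below is a minimal sketch of it, assuming a Peukert exponent of about 1.2 (a typical ballpark for a flooded lead-acid starter battery; the exponent is my assumption, not a figure from this answer), which reproduces the rough "little over 30 minutes at 40 A" estimate.

```python
# Peukert's law sketch: estimate runtime at a given load from the
# 20-hour Ah rating. The exponent of 1.2 is an assumed typical value
# for flooded lead-acid; real batteries vary.

def runtime_hours(capacity_ah: float, load_amps: float,
                  rated_hours: float = 20.0, peukert_k: float = 1.2) -> float:
    """Runtime in hours for a constant load, given the 20-hour capacity."""
    rated_current = capacity_ah / rated_hours          # e.g. 40 Ah / 20 h = 2 A
    return rated_hours * (rated_current / load_amps) ** peukert_k

print(runtime_hours(40, 2))    # ~20.0 h at the rated 2 A load
print(runtime_hours(40, 40))   # ~0.55 h, i.e. about 33 minutes at 40 A
```

The energy that doesn't make it to the load at high discharge rates is lost internally, which is where the waste heat comes from.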