You people have the right idea, but not the right implementation.
As someone with extensive electronics experience, allow me to clarify.
What you see in movies is Hollywood, not reality. Especially in "high-tech" movies about computers, hackers, etc. Film directors are NOT electrical engineers, so don't assume that what you see can really be done.
First of all, Giza is correct on several points. The loop current on a terrestrial line is reduced to a tiny little sensing current if the phone is left off-hook for a long time.
How long? VERY long... think more along the lines of 10 - 15 minutes! (At least in my area with my local telephone provider.) After the recorded "please hang up" message repeats about 10 times, a VERY LOUD pulsing tone similar to a fast busy signal is played for several minutes. I don't think anyone wants to wait that long before making a cellular data call. Long story short: leave the terrestrial phone line out of this!
Giza also mentioned that the acoustic coupler will introduce harmonic distortion. Well, it could very well be so bad you may not even be able to use such a contraption past 300 baud FSK! You'd be much better off using the headset jack on the cellular telephone to interface directly with the cell phone's audio circuits.
Giza is also correct in mentioning that the impedance won't match. This can be fixed with some basic electronic circuitry, but if you don't understand how to interface op-amps and design RC networks, forget about pursuing this project. If, however, this sort of electronics design is child's play for you, there's more...
We still need to figure out a way of connecting the cell phone to the modem. We've already established that using a terrestrial phone line and telephone is NOT going to work well, or even at all. The problem is, terrestrial telephones use the same pair of wires for both the SEND and RECEIVE audio. Separating this requires a very carefully balanced impedance network which almost exactly matches the impedance characteristics of the modem.
Obviously no telephone line in existence is a perfect match, but because the SNR of a cellular connection is much worse than a terrestrial (wireline) connection, we don't have much room for error here. Your best bet? Forget about trying to make a line hybrid. (That's the name for the impedance-balancing network that separates the SEND and RECEIVE audio.)
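If you want a number behind that "forget it": a hybrid only cancels its own transmit audio as well as its balance network matches what the line actually looks like. Here's a rough back-of-the-envelope script (Python used purely as a calculator, with made-up round resistive impedances; a real line impedance is complex and changes with frequency, which is exactly the problem):

    import math

    # A 2-wire hybrid rejects its own transmit signal roughly according to how well
    # the balance network matches the real line impedance. A common figure of merit
    # (ignoring the hybrid's own fixed loss):
    #   rejection ~ 20*log10( |Z_line + Z_balance| / |Z_line - Z_balance| )
    # The values below are ASSUMED round numbers, for illustration only.
    def rejection_db(z_line, z_balance):
        return 20 * math.log10(abs(z_line + z_balance) / abs(z_line - z_balance))

    for z_line in (620.0, 660.0, 750.0, 900.0):      # what the "600 ohm" line really is
        print(f"line looks like {z_line:.0f} ohms vs a 600 ohm balance: "
              f"{rejection_db(z_line, 600.0):.1f} dB of TX-into-RX rejection")

Even a 10% mismatch leaves you in the mid-20s of dB, and every dB of your own transmit audio leaking into your receive path eats into an SNR budget that the cellular link has already mostly spent.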
Instead, disassemble the modem and trace its DAA circuit. (Data Access Arrangement. It's the analog circuitry between the phone line jack and the A/D converters.) Disconnect the A/D converters from the DAA's hybrid network, but at a point where the A/D converter chip's inputs and outputs are still buffered (if possible). Note that on most modern modems, the DAC (digital to analog converter), ADC (analog to digital converter), and DSP (digital signal processor) are all on the same IC.
Now that you've got separate SEND and RECEIVE audio lines which never cross or mix together, it's up to you to design a suitable impedance matching and level matching circuit to connect to the cell phone's headset jack.
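As a starting point for the impedance and level matching (before you get fancy with op-amps), a plain resistive minimum-loss pad will do. The 600 ohm modem-side and 150 ohm headset-jack-side figures below are assumptions purely for illustration; measure your own hardware before you solder anything:

    import math

    # Minimum-loss resistive L-pad between a higher impedance Z1 and a lower Z2.
    # 600 ohms (modem/DAA side) and 150 ohms (headset-jack side) are ASSUMED
    # round numbers for illustration only; measure the real thing.
    Z1, Z2 = 600.0, 150.0                       # ohms, Z1 > Z2

    R_series = math.sqrt(Z1 * (Z1 - Z2))        # goes in series with the Z1 side
    R_shunt = Z2 * math.sqrt(Z1 / (Z1 - Z2))    # goes across the Z2 side
    loss_db = 20 * math.log10(math.sqrt(Z1 / Z2) + math.sqrt(Z1 / Z2 - 1))

    print(f"series R ~ {R_series:.0f} ohms, shunt R ~ {R_shunt:.0f} ohms")
    print(f"insertion loss ~ {loss_db:.1f} dB (make it up in the level-matching stage)")

An op-amp buffer on each side is the nicer way to do it (and buys back the pad's loss), but the pad at least gives you numbers to design around.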
Oh yeah, don't forget de-coupling capacitors or high-quality audio transformers. You're dealing with two devices that may be at different ground potentials, with one of the devices bleeding RF all over the place. Have fun with this one, and may the force be with you.
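For the capacitor values: the coupling cap and the input it drives form a first-order high-pass filter, so put the corner well below the lowest tone you care about, otherwise you're adding exactly the low-frequency roll-off and phase shift you're trying to avoid. Assuming, just for illustration, that the stage being driven looks like about 600 ohms:

    import math

    # AC-coupling cap plus the input impedance it feeds = a first-order high-pass filter.
    R_in = 600.0      # ohms; ASSUMED input impedance of the stage being driven
    f_corner = 50.0   # Hz; comfortably below the ~300 Hz bottom of the voice band

    C = 1 / (2 * math.pi * f_corner * R_in)
    print(f"coupling cap of at least {C * 1e6:.1f} uF (round up to a standard value)")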
Alright, let's say you've mastered the art of RF electronics and all the "black magic" associated with it. You've successfully modified your modem (or built a damn good impedance matching hybrid that simulates a real-world phone line) and connected it to your cell phone. Signal levels are right where they should be, SNR is great and THD is nice and low. You're still not done... Now you have to wrestle with the cellular telephone and how its engineers designed it.
Cell phones are designed for voice, not modem tones. The microphone circuitry of the cell phone will likely have filtering to cut low frequencies, boost the mid, etc. The cell phone will also have an AGC (automatic gain control) circuit on the microphone so that when you're yelling in the heat of an argument, your victim will still hear you relatively undistorted. And so that when you're quietly whispering to your partner in crime about phone phreaking, your partner will actually hear you over the phone! Both these types of circuits may wreak havoc on modem tones, which are VERY sensitive to phase error (unless you're perfectly happy with 300 baud).
If the whole phase error thing is confusing you at this point, go read about FSK, PSK, QPSK, and QAM. That's a whole other topic which is beyond the scope of this explanation. If you understand how those modulation schemes work, you'll see why PSK, QPSK, and QAM (which are used for all connection rates faster than 300 baud) need near-perfect phasing.
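If you want to see the phase sensitivity with your own eyes, here's a toy numpy sketch. The Gray-coded constellation below is a generic textbook QPSK mapping, not any particular modem standard; the point is simply that an uncorrected phase rotation past the 45 degree decision boundary turns every symbol into its neighbor:

    import numpy as np

    # Toy QPSK: each pair of bits is sent as one of four carrier phases.
    constellation = {(0, 0): 45, (0, 1): 135, (1, 1): 225, (1, 0): 315}   # degrees
    inverse = {v: k for k, v in constellation.items()}

    def decide(symbol):
        # Nearest-phase decision on the received complex symbol.
        angles = np.array(sorted(inverse))
        diffs = np.abs(np.angle(symbol, deg=True) % 360 - angles)
        diffs = np.minimum(diffs, 360 - diffs)
        return inverse[int(angles[np.argmin(diffs)])]

    dibits = [(0, 0), (0, 1), (1, 1), (1, 0)]
    for phase_error in (0, 20, 50):                  # degrees of uncorrected rotation
        errors = 0
        for dibit in dibits:
            tx = np.exp(1j * np.deg2rad(constellation[dibit]))
            rx = tx * np.exp(1j * np.deg2rad(phase_error))   # channel rotates the carrier
            errors += decide(rx) != dibit
        print(f"{phase_error:2d} degrees of phase error: {errors}/{len(dibits)} symbols wrong")

Real modems obviously track and correct phase continuously, but they can only track what's still there. FSK doesn't care because it only has to ask "which of two tones am I hearing", which is why 300 baud keeps coming up as the fallback.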
Finally, this may have been a cool project in the 1980's and early 1990's, but I highly doubt you could do it now. Why? Because it only works with analog phones. Since 99% of the cellular telephone population uses digital phones these days, 99% of the population can't even use a project such as this one.
Here's the scoop: digital cell phones convert the sound into a digital bitstream, then apply a lossy compression algorithm to reduce the data rate. When the audio is restored at the other end of the connection, it does NOT match the original audio! And guess what... the phase angles of all the component frequencies are the first thing to be tossed in the garbage with lossy compression! The very thing which modem tones need to remain intact! See, our ears and brains aren't very sensitive to phase error, so to humans the audio still sounds pretty close. But a modem trying to make sense of it won't stand a chance. (Unless you're using FSK modulation. In other words, 300 baud.)
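Here's a crude illustration of that point (a cartoon, not a real vocoder; actual speech codecs are far more sophisticated, but the phase story is the same). Keep every frequency's magnitude, throw away its phase, and see what survives:

    import numpy as np

    rng = np.random.default_rng(0)
    fs, baud, fc = 8000, 500, 1800            # sample rate, symbol rate, carrier (all made up)
    bits = rng.integers(0, 2, baud) * 2 - 1   # one second of random +/-1 symbols
    symbols = np.repeat(bits, fs // baud)     # rectangular pulses: crude BPSK
    t = np.arange(len(symbols)) / fs
    x = symbols * np.cos(2 * np.pi * fc * t)  # the data lives entirely in the carrier phase

    # Keep each frequency's magnitude, replace its phase with a random one.
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, X.shape)
    phases[0] = phases[-1] = 0.0              # leave DC and Nyquist alone (they carry no phase)
    y = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

    print("magnitude spectrum preserved:", np.allclose(np.abs(np.fft.rfft(y)), np.abs(X)))
    print("waveform correlation after phase loss:", round(float(np.corrcoef(x, y)[0, 1]), 3))

A spectrum analyzer (and your ear) sees essentially the same signal; the waveform itself, which is what a PSK or QAM demodulator actually locks onto, barely correlates with the original.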
So let's say you've got a "dual mode" cell phone which can be switched into analog mode. You might be in luck, or you might not. A lot of modern phones, even if they can use analog cellular networks, still pass the audio through a DSP (digital signal processor) before it goes anywhere. If the DSP has aggressive filtering, AGC, or possibly even dynamic range compression, there's a good chance the modems won't be able to connect.
If you're really persistent and insist on making this work, here's a little tip: put the phone into "TTY" or "TDD" mode. This sets the headset jack to industry-standardized levels and impedance, and turns off side-tone. (Side-tone is the term referring to "hearing yourself" in the handset when you speak. For humans, it adds comfort because it doesn't feel like you're talking into a dead telephone. For modems, it's problematic because it's just one more bit of noise that the modem has to try to filter out.) If you can find it, download the "TIA/EIA-PN-3-4558-RV1 Rev. A of TIA TSB121" specification. There's lots of good information to work from in there, particularly about the electrical characteristics of the 2.5mm headset jack in TTY (or TDD) mode.
There are still a few issues to be considered on the modem side too...
By default, the modem waits for a dial-tone before dialing, so you have to disable this. As someone already mentioned, you can use the ATX0 command (that's a zero, not the letter O), but there's a better alternative: ATX3. ATX3 will disable the wait for dial-tone, but will retain things like busy signal detection. (ATX0 disables all of the modem's intelligence at recognizing telephone signalling.)
You may also want to increase the amount of time the modem spends trying to establish a connection, since the modem will have a lot of different protocols to try before it finally finds something that works on such a contraption. This time value (in seconds) is usually stored in register S7. Type ATS7=n, where n is the number of seconds.
Consider forcing your modem to use lower connection speeds. Hell will become a very cold place before you'll get a 56k connection over a cell phone, so there's no point in even letting your modem try it. It'll just make the connection process take longer... perhaps long enough to time out. Some modems are also buggy and won't bother trying all the way down to their lowest speeds (which may be the best you can expect for this!) unless you force them to use only low speeds.
Also, you may want to increase the modem's tolerance to line noise... specifically, loss of carrier. Most modems will drop the connection if they lose track of the carrier tone for just 1 or 2 seconds. Increase this to at least 4 or 5 seconds. This time value (in tenths of a second) is usually stored in register S10. Type ATS10=n, where n is the time in tenths of a second (so 5 seconds = ATS10=50).
Ultimately, whichever modem has the lower S7 and S10 settings determines the actual timeouts for the connection. Even if you set your modem to allow 25.5 seconds of garbled audio before it finally drops the connection (ATS10=255), if the remote modem answering the call only allows 2, then 2 or more seconds of noise is all it takes to get disconnected!
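Putting the modem settings together, a starting init string (typed into a terminal program before dialing, or stuffed into your dialer's "extra initialization" box) could look something like this; the 90 and 50 are just example values in line with the above:

    AT&F X3 S7=90 S10=50

That's factory defaults (&F), dial without waiting for dial-tone but keep busy detection (X3), 90 seconds to negotiate (S7), and 5 seconds of carrier-loss tolerance (S10). Add whatever your particular modem uses to cap the connect speed; on a lot of chipsets it's the +MS command, on others an S-register like S37, and the exact syntax varies, so check the manual rather than taking my word for it.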
------------
Now, before anyone argues with all this, allow me to speak from experience.
I have tried this.
The results?
- Analog cordless telephone (closest thing I had to an all-analog cell phone at the time): modems connected at 9600 bps, sometimes 12,000 bps.
- Dual mode cellular telephone in analog TTY mode: modems connected at 4800 bps, sometimes 7200.
- Dual mode cellular telephone in analog voice mode: modems connected at 2400 bps.
- Digital GSM cellular telephone in TTY mode: modems connected at 300 bps FSK.
- Digital GSM cellular telephone in voice mode: modem sometimes connected at 300 bps FSK, connection was very unreliable.
This was done using active impedance matching circuits with isolation transformers and de-coupling capacitors, parametric equalizers to compensate for the cell phone's tonal characteristics, and a modified modem where the SEND and RECEIVE audio never crossed paths.
A lot of work for something that never achieved very impressive results.
I guarantee you that an old telephone's microphone and earphone held up against a cell phone with masking tape will NOT work.