
Student Invention May Significantly Extend Mobile Device Battery Life

Soulskill posted more than 5 years ago | from the wireless-to-wireless dept.

Power 160

imamac writes with this excerpt from news out of Carleton University: "Atif Shamim, an electronics PhD student at Carleton University, has built a prototype that extends the battery life of portable gadgets such as the iPhone and BlackBerry, by getting rid of all the wires used to connect the electronic circuits with the antenna. ... The invention involves a packaging technique to connect the antenna with the circuits via a wireless connection between a micro-antenna embedded within the circuits on the chip. 'This has not been tried before — that the circuits are connected to the antenna wirelessly. They've been connected through wires and a bunch of other components. That's where the power gets lost,' Mr. Shamim said." The story's headline claims the breakthrough can extend battery life by up to 12 times, but that seems to be a misinterpretation of Shamim's claim that his method reduces the power required to operate the antenna by a factor of about 12 (3.3 mW, down from 38 mW). The research paper (PDF) is available at the Microwave Journal. imamac adds, "Unlike many of the breakthroughs we read about here and elsewhere, this seems like it has a very high probability of market acceptance and actual implementation."


Counter-intuitive! (5, Insightful)

4D6963 (933028) | more than 5 years ago | (#26179879)

Wow, is it me or does it feel profoundly counter-intuitive that you'd lose more power over the wire than over radio waves?

Re:Counter-intuitive! (2, Interesting)

Cylix (55374) | more than 5 years ago | (#26179955)

I don't think he's separating the amplifier from the antenna, but rather feeding an amplifier attached directly to the antenna. The signal loss from source to antenna over the length of the run has to be made up, and this is done by stepping up the output of the amplifier stage.

This configuration isn't uncommon and many microwave systems employ this technique. (Attaching the amplifier nearly directly to the antenna.)

Though I would have to look a bit at the design, this is the only item I can think of. In nearly every phone I have busted open, the antenna is separated quite a bit from the rest of the components.

Impedance Matching technique (3, Informative)

Anonymous Coward | more than 5 years ago | (#26180163)

"This configuration isn't uncommon and many microwave systems employ this technique. (Attaching the amplifier nearly directly to the antenna.)"

I agree, it sounds very much like some kind of impedance matching technique where the inductive coupling is direct to the antenna. I'm not so sure that's as patentable as this university is drumming it up to sound. (I guess they hope to earn a lot of money from it, mainly from phone companies.) But impedance matching using windings to wirelessly couple to the antenna (where the antenna acts like part of the winding) isn't something new. If anything, it's something very old.

In other news (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26181027)

Atif Shamim said of his technology, "Praise Allah! God willing, my technology will make more efficient and longer-lasting IED's for my buddies in Iraq! Yes to Allah! No to infidels!"

Re:Counter-intuitive! (3, Interesting)

linzeal (197905) | more than 5 years ago | (#26179975)

There are many orders of magnitude more atoms in the traces on the PCB than in the air the radio waves travel through from the antenna on the cell phone to the cell tower, and even fewer when we are talking about a matter of mm. The more atoms you have to push your information through, the more amperage it takes to overcome the resistance [wikipedia.org], and since radio waves are a form of EM radiation they follow similar laws, which just appear more complicated [wikipedia.org].

Re:Counter-intuitive! (1, Interesting)

timmarhy (659436) | more than 5 years ago | (#26180055)

Umm, doesn't air have a lower conductivity than copper? Hence electricity runs happily along copper at low voltages but needs 1000 volts to jump just 1 cm through the air. TFA is hopeless; it almost sounds like he cut the wires on his iPhone, which stopped it transmitting, then declared a major breakthrough in battery life.

Re:Counter-intuitive! (3, Informative)

sillybilly (668960) | more than 5 years ago | (#26180473)

We're talking superhigh frequencies near 1 GHz. At such frequencies all of the electric/magnetic field generated "current" runs on the surface of wires anyway, not through the bulk, due to "skin effect". Or the electric/magnetic field can simply propagate through free space as electromagnetic radiation, like microwaves in your microwave oven, or light through empty space. Light propagates better through vacuum than through a copper wire, doesn't it?
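For scale, the skin depth that makes this true is tiny. A rough back-of-the-envelope sketch in Python (the 1 GHz frequency and standard copper properties are assumptions for illustration, not figures from the article):

import math

rho_copper = 1.68e-8      # resistivity of copper, ohm*m (textbook value)
mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m (copper is essentially non-magnetic)
f = 1e9                   # 1 GHz, roughly cellular frequencies
omega = 2 * math.pi * f

# skin depth: delta = sqrt(2 * rho / (omega * mu))
delta = math.sqrt(2 * rho_copper / (omega * mu0))
print(f"skin depth in copper at 1 GHz: {delta * 1e6:.1f} micrometers")  # about 2 micrometers

So at these frequencies the current really does crowd into the outer couple of micrometers of a conductor.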

Re:Counter-intuitive! (0, Insightful)

Anonymous Coward | more than 5 years ago | (#26180663)

Dude...speaking as an RF Engineer, I think you should just declare yourself as "officially mentally retarded"...

Re:Counter-intuitive! (5, Funny)

Anonymous Coward | more than 5 years ago | (#26180477)

Yes, once we figure out how to overcome the resistive quality of air, I envision a new age where we can have a wireless YouTube-like service.

I will call this great thing television.

Re:Counter-intuitive! (5, Interesting)

Plekto (1018050) | more than 5 years ago | (#26180173)

They also do this in recording studios. It takes far less power and wiring (or it can be done via RF or IR) to have each speaker have its own small amplifier than to try to power the whole room with a rack of giant units.

This also would create less interference, believe it or not, since running wires near live electrical components (even the tiny components on a circuit board make a difference - just stick an AM radio near your computer's motherboard) tends to cause interference. This is the other reason recording studios do this: they can run a very heavily shielded or wireless line-level signal to each speaker directly. Less power, less clutter, less interference.

Re:Counter-intuitive! (4, Informative)

Anonymous Coward | more than 5 years ago | (#26180289)

Powered speakers are popular because they give monitor manufacturers a way to make line-level crossovers, power amps and speaker drivers work together.
Having control over the specifications of all those components means better fidelity. It is tidier too.

I don't think RF or IR is ever used with studio monitors. They would cause phase alignment problems and a loss of fidelity. Simpler is better, so people use wires. Anyway, aren't we trying to avoid RF transmitters here?
Speaker cables can be shielded too, but people don't bother as any interference would be imperceptible.

Power loss in speaker cables is pretty tiny too. Powered speakers really are all about convenience and potential better fidelity.

Re:Counter-intuitive! (3, Insightful)

Plekto (1018050) | more than 5 years ago | (#26180523)

They use wireless just fine with mics and pickups and so on on stage for these reasons all the time. Fewer cables, fewer problems; and if you've ever had to deal with grounding issues, wireless or a line-level signal that's amplified at the source is a huge improvement. I suspect that's the real problem here - too much background RF noise from the components. Rather than brute-forcing it, he decided to find a way to get around this and clean up the signal in the process.

Btw, most pros don't use wired mics any more. Too many issues. Most studios don't use non-powered speakers any more, either. You're right - I haven't found many setups that use IR or wireless (yet), but I can find many professional systems that use S/PDIF, optical, or other non-analog transmission methods. (Shoot, most home theater interconnects are now HDMI for exactly these sorts of reasons.)

Re: wired vs wireless audio signals (5, Informative)

Anonymous Coward | more than 5 years ago | (#26180703)

Not true.

Wired mics sound better because they lack the companders involved in transmitting the audio signal. Performers like wireless because it's convenient, not because it sounds better. Those concerned with sound quality stick to wired.

Balanced signals use common-mode rejection to eliminate induced noise. This has been standard practice for years. Recording studios use either balanced wiring or digital in the form of AES or optical ADAT.

Re:Counter-intuitive! (1)

Jeff DeMaagd (2015) | more than 5 years ago | (#26180723)

I'm lost on how the antenna in a phone is a major power consumer. Aren't the screen, power converters, CPU and all the modulators in the radios each consuming more power than the wire that connects the transceiver to the antenna? If it's really consuming that much power, then it stands to reason that wire should burn up.

The article is short on details and so poorly worded that I think the article should not have been published. Even if it's valid, the writing makes it look like pseudoscience.

Re:Counter-intuitive! (3, Interesting)

Plekto (1018050) | more than 5 years ago | (#26180827)

The problem is that the antenna isn't a major power consumer. It's that the signal path between the circuitry and the antenna is so full of junk on many models, due to poor slapped-together designs, that the signal must be boosted a lot to communicate with the local cell phone tower. In the old days this wasn't a problem, as there weren't major limits on power. Some old analog units transmitted as much as 10-20 W! Now they have to limit their power to a fraction of that. If the digital signal can't be boosted enough to communicate and it's already at that FCC-imposed limit, you're out of luck. No bars. Technically you never actually get "no bars" - you just get too little for the error correction to work any more.

Re:Counter-intuitive! (5, Informative)

crowtc (633533) | more than 5 years ago | (#26179981)

I'm not an antenna designer, but by the looks of it, the design is basically a miniature on-chip waveguide, efficiently channeling the RF energy toward the external antenna and minimizing wasted radiation.

Wires radiate RF like mad unless they're heavily shielded, which is something you really can't do effectively in tight spaces. Of course, testing was done at 5.2GHz, so it will be interesting to see how it works at cellphone frequencies - packaging size might become a factor at lower frequencies.
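A quick sense of the size question (a trivial sketch; the 5.2 GHz figure is from the comment above, while the 1.9 GHz and 900 MHz cellular bands are just illustrative assumptions):

# free-space wavelength sets the rough scale of antennas and coupling structures
c = 3e8  # speed of light, m/s
for f_ghz in (5.2, 1.9, 0.9):
    wavelength_cm = c / (f_ghz * 1e9) * 100
    print(f"{f_ghz:3.1f} GHz: wavelength {wavelength_cm:5.1f} cm, quarter-wave {wavelength_cm / 4:4.1f} cm")

Roughly 6 cm at 5.2 GHz versus 33 cm at 900 MHz, which is why an in-package coupling structure gets harder to keep small at cellular frequencies.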

Re:Counter-intuitive! (2, Insightful)

thebes (663586) | more than 5 years ago | (#26180699)

Umm, dude...just because you shield a component doesn't mean it stops radiating. Shielding inhibits EM fields which are already present. To reduce radiated losses, you need to either improve the fundamental design of the circuit or make it radiate so well that you build an antenna instead.

Re:Counter-intuitive! (1, Informative)

Anonymous Coward | more than 5 years ago | (#26179997)

From the article:
"The strategy is useful as it eliminates the need of isolating buffers, bond pads, bond wires, matching elements, baluns and transmission lines. It not only reduces the number of components and simplifies SiP design but also
consumes lower power."

Less compenents = Less power?

Re:Counter-intuitive! (1)

thebes (663586) | more than 5 years ago | (#26180713)

Yes. Every connector, isolator, circulator, switch, filter, duplexer, wire, conductor, etc. contributes to the losses in the circuit. As much as half your power can be lost after the final power amplifier (more than that and you need lessons in radio design, or you need to adjust your requirements).

Re:Counter-intuitive! (2, Informative)

e9th (652576) | more than 5 years ago | (#26180077)

From the research paper: [mwjournal.com]

The conventional LTCC package provides 3 times more range than the proposed design but consumes 12 times more power.

So you save power versus the conventional design, but you lose range.

Re:Counter-intuitive! (1)

zippthorne (748122) | more than 5 years ago | (#26180175)

So? You just bump the power up on the new design. 3^2 = 9, so the new design is actually claimed to be 33% more efficient.

Still, that's not zero percent.
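The arithmetic behind that figure, as a sketch (it assumes an inverse-square relationship between power and range, which other posts below dispute):

power_ratio = 12        # conventional package draws ~12x the power (38 mW vs 3.3 mW)
range_ratio = 3         # ...but provides 3x the range
power_for_same_range = range_ratio ** 2   # inverse-square: 3x range should only need 9x power

print(f"{power_ratio / power_for_same_range:.2f}")  # 1.33 -> roughly a 33% improvement, not 12x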

How about that inverse-square law? (2, Insightful)

Anonymous Coward | more than 5 years ago | (#26180187)

From the research paper: [mwjournal.com]

The conventional LTCC package provides 3 times more range than the proposed design but consumes 12 times more power.

So you save power versus the conventional design, but you lose range.

To provide the same signal strength at triple the range, you need to broadcast 9 times as much power. To broadcast 9 times as much power with an equally compact transmitter, is it surprising that you need to spend 12 times as much power due to size/efficiency trade-offs?

This doesn't sound like an advance at all.

Re:How about that inverse-square law? (4, Informative)

mako1138 (837520) | more than 5 years ago | (#26181015)

You are assuming an isotropic emitter, where field strength falls off as 1/r^2. That behavior is invalid for other antennas; for example a dipole's field strength falls off as 1/r (in the far-field approximation). The comparison is complicated by the fact that the radiation patterns of the antennas used in the paper are directional and different. The "conventional" chip used a folded dipole with a "boresight radiation pattern", and the "proposed" chip used a custom design with a front-to-back ratio of 10 dB.

Table 1 has the numbers:

Module Type                            | Power Consumption | Gain     | Range
Standalone TX chip                     | 3.3 mW            | -34 dBi  | 1 m
TX chip in conventional LTCC package   | 38 mW             | -1 dBi   | 75 m
TX chip in proposed LTCC package       | 3.3 mW            | -2.3 dBi | 24 m

Let's do some reckless hand-wavy extrapolation. The difference in power is 38/3.3 = 11.5 = 10.6 dB; if we assume perfect scaling of the new package to 38mW, we'd expect 10.6-2.3=8.3 dBi. This is an improvement of 9.3 dB over the conventional method -- it's almost 10 times as efficient.
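That arithmetic, spelled out (a sketch reproducing the extrapolation above; the table values are from the paper as quoted, and the perfect-scaling assumption is the commenter's, not the paper's):

import math

def db(ratio):
    return 10 * math.log10(ratio)

p_conv, g_conv = 38.0, -1.0   # conventional LTCC package: mW, dBi
p_prop, g_prop = 3.3, -2.3    # proposed package: mW, dBi

power_advantage_db = db(p_conv / p_prop)    # ~10.6 dB more power in the conventional package
scaled_level = power_advantage_db + g_prop  # proposed package "scaled up" to 38 mW
print(f"power ratio: {power_advantage_db:.1f} dB")
print(f"advantage of scaled proposed package: {scaled_level - g_conv:.1f} dB")  # ~9.3 dB, close to 10x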

This analysis ignores, among other things, the relative directionalities of the antennas. I wonder why they didn't choose a more directional antenna for the "conventional" chip, or use the same sort of antenna in order to make a level comparison.

The other point of comparison is between the "standalone" chip and the "proposed" chip. A 32 dB improvement with no power increase is nothing to sneeze at!

Re:Counter-intuitive! (5, Funny)

Anonymous Coward | more than 5 years ago | (#26180465)

No, you don't get it: wires is how you lose power. Try disconnecting your battery and see how long it lasts then!

In fact I should do my PhD on that.

Re:Counter-intuitive! (2, Informative)

camperslo (704715) | more than 5 years ago | (#26180937)

The summary is misleading.

The paper describes a method of simply and efficiently coupling energy from the transmitter VCO chip to the main antenna, making good use of the R.F. energy that chip provides. It seems that most of the power savings comes from eliminating an external buffer amplifier and the power it would otherwise use.
That's great if the chip can provide sufficient output power, and if the spectral purity is good enough to comply with F.C.C. or other requirements. I'd expect that most cell phones need more transmit power than provided by the example in the paper, but perhaps the same methods are viable with higher power modules.

Note that the power savings only occurs in transmit mode, and the savings is only in the circuit providing signal to the antenna. Something like an iPhone has a bunch of other electronics and a display using considerable power, none of which is affected by the changes proposed in the paper.

What's presented is innovative but in reality isn't likely to do much for the overall power consumption of a complex product like an iPhone. The savings would be more likely to amount to something in smaller and much simpler devices, more along the lines of battery-powered WiFi or Bluetooth products.

Nice (1)

bytethese (1372715) | more than 5 years ago | (#26179887)

I like the idea of using my iPhone for days at a time between charges. Heck, maybe it would provide enough battery life for a useful iPhone/GPS unit.

Re:Nice (1)

Tubal-Cain (1289912) | more than 5 years ago | (#26180003)

I doubt that the antenna makes up the majority of your iPhone's power usage.

Re:Nice (1)

bytethese (1372715) | more than 5 years ago | (#26180097)

The 3G radio actually sucks enough juice that Apple gave us a toggle option to use EDGE instead. :)

Re:Nice (1)

Tony Hoyle (11698) | more than 5 years ago | (#26180483)

...and switch WiFi off (which is even more power hungry, btw). 3G is only more power hungry in weak areas (since it'll try to find the weak 3G antenna rather than the more powerful 2G one); in an area of good reception it makes no difference.

But cellphone antennas are already pretty power efficient compared to driving the display, backlight etc... and let's not even get started on the GPS. You aren't going to get multiples of battery life just from this invention.

Re:Nice (1)

thebes (663586) | more than 5 years ago | (#26180751)

No, "cellphone antennas" are not "power efficient". I assume you actually mean the radio (antennas [the antenna itself, not the other components you are incorrectly referencing] are generally passive...they do have a radiating efficiency, but they generally don't consume power in the classical sense). Cellular transmitters (base station and mobiles) are usually only in the 25-35% range of efficiencies. This is the result of high peak to average ratios in the signal which require the amplifier to be oversized by as much as 10x to ensure the FCC and other similar bodies will certify the equipment. In general, if your mobile is actually transmitting lets say 50 mW (the power reaching the antenna), the transmit chain would likely be consuming at least 150 mW (likely more).

Wrong calculator button (1)

kpainter (901021) | more than 5 years ago | (#26179897)

I think this joker hit the '+' button when he meant to hit the '-' button. 12 times. I don't think so.

This Sounds Like a Great Idea (5, Funny)

WaxlyMolding (1062736) | more than 5 years ago | (#26179919)

...until you consider the security ramifications.

Re:This Sounds Like a Great Idea (5, Funny)

narcberry (1328009) | more than 5 years ago | (#26179969)

Yeah, he'd basically be short-range broadcasting his long-range broadcast. If you got within several feet of him and used the right equipment, you might be able to listen in on everything he's broadcasting!

Re:This Sounds Like a Great Idea (5, Funny)

ODiV (51631) | more than 5 years ago | (#26180025)

So put a Faraday cage around it?

Re:This Sounds Like a Great Idea (5, Funny)

zygotic mitosis (833691) | more than 5 years ago | (#26180321)

This has become a costly way of talking to yourself, then. Crackheads on the bus have a simpler method.

Re:This Sounds Like a Great Idea (1)

kpainter (901021) | more than 5 years ago | (#26180037)

No, because he claims it is 12 times more efficient. If that is true, you would have to work 12 times harder to listen to what would get radiated anyway. This guy has figured out a way to patent a matching network.

Re:This Sounds Like a Great Idea (-1, Offtopic)

mustafap (452510) | more than 5 years ago | (#26180027)

Some idiot who didn't get your joke marked you as Insightful. That's actually funnier than your joke!

I love slashdot

Re:This Sounds Like a Great Idea (1, Funny)

Anonymous Coward | more than 5 years ago | (#26180067)

The ramifications of sending data a short distance to the antenna, which is then relayed a much longer distance to the base station...yeah, I'm sure those hackers are gonna pull your data off your antenna from this connection rather than the antenna's connection to the tower

Re:This Sounds Like a Great Idea (5, Interesting)

lysergic.acid (845423) | more than 5 years ago | (#26180207)

what are the security ramifications? that a 3rd party might be able to intercept the wireless transmission just like they already can? whether you use this technique or not, you're still going to be broadcasting the signal wirelessly. that's why GSM signals are supposed to be encrypted.

the GSM encryption was broken earlier this year [forbes.com] . the security ramifications of that are far more serious. why would you be worried about someone intercepting this weak wireless signal when attackers can already eavesdrop on your conversation from miles away?

heck, if they're close enough to intercept this signal, then they're already within earshot of you. they wouldn't need to intercept the wireless signal to the antenna. anyone silly enough to do so would look rather conspicuous standing there with a laptop and a directional antenna pointed at your phone.

Re:This Sounds Like a Great Idea (0)

Anonymous Coward | more than 5 years ago | (#26180727)

And RFID has no security implications of walking past a reader...

Someone riding a subway, airplane, etc...

Re:This Sounds Like a Great Idea (-1)

bill0755 (692856) | more than 5 years ago | (#26180257)

...until you consider the security ramifications.

Ah, nothing like an amateur security expert with no brains and too little information. Think about it nit wit. If it wasn't secure on the way to the antennae, how in the hell would it be secure when it was broadcast or received! Find another hobby besides spreading FUD. I'm tired of you already.

Re:This Sounds Like a Great Idea (2, Funny)

BronsCon (927697) | more than 5 years ago | (#26180657)

w....
wh...
who?

**WHOOSH**

Re:This Sounds Like a Great Idea (0)

Anonymous Coward | more than 5 years ago | (#26180987)

I believe you have the first signs of Alzheimer's.

What? (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26179927)

The explanation given on the website is very poor. The resistance of the wires connecting the transceiver and the antenna is low and little power is lost in them.

In addition, they quote him as saying "There are so many applications in the iPhone, it's like a power-sucking machine," but what they're talking about is the power lost at the antenna and not the processor, which is what he implies. Therefore it wouldn't do anything to prolong battery life when using non-transmitting applications.

Perhaps this is a case of announcing something without giving away what it really is or perhaps pathetic technology journalism?

Re:What? (2, Informative)

evanbd (210358) | more than 5 years ago | (#26180059)

Definitely bad journalism. The culprit isn't wire resistance, it's reactance. The impedance mismatches at the junctions from amplifier to circuit board to connector to cable to antenna all create reflections and thus standing waves [wikipedia.org]. The power that goes into those standing waves is reflected back into the amplifier, where it is dissipated as heat. The result is that you need (in his example) a 38 mW amplifier in order to get 3.3 mW of radiated power out of the antenna.

What his invention does is create a near-field transmission to the antenna directly from the amplifier output, without all that intervening cable and PCB trace and such. Near-field antennas can be efficient at *much* smaller sizes, so you can put one on the chip. It's counterintuitive to me that you could get lower losses that way, but that's what he's claiming. Multi-GHz radio waves (microwaves) behave in weird ways, and I'm not an RF engineer...
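A small illustration of how an impedance mismatch wastes power (the impedances here are made-up values for illustration, not anything from the paper):

def reflected_fraction(z_load, z_source=50.0):
    # reflection coefficient at a junction between two impedances
    gamma = (z_load - z_source) / (z_load + z_source)
    return abs(gamma) ** 2  # fraction of incident power reflected back toward the source

# hypothetical junctions along the path (board trace, connector, antenna feed)
for z in (35.0, 65.0, 75.0):
    print(f"{z:5.1f} ohm junction reflects {reflected_fraction(z):.1%} of the incident power")

Each individual mismatch is small, which is part of why several posters below are skeptical that this path alone can account for a 12x difference.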

Re:What? (1)

Joce640k (829181) | more than 5 years ago | (#26180491)

Can't that be fixed by better wiring?

I'm sure the cell phone engineers aren't idiots, the impedance mismatches in existing phones will be minimal.

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26181099)

Absolutely! That's why I upgraded the wiring in my cell phone with Monster Cables. The neutrino spectragraphic wave distortion is *SO* much lower now!

Re:What? (5, Informative)

thebes (663586) | more than 5 years ago | (#26180803)

Oh my god. Please not another "informative" post. I really wish you people would stop commenting on these articles when you clearly have no clue what you are talking about. The reflected power (if it happens to exist in this case...which it doesn't because these transmitters are designed quite well and usually include a circulator or isolator at the output of the amplifier to ensure an excellent match) does not go back into the amplifier, because if it did the amplifier would not work as it was designed and would either oscillate or produce extremely poor waveform quality at the output.

Now, if you can bypass the circulator/isolator I mentioned above (which is what I gather they are trying to do in this article) then that is one less place power can be lost on the way to the antenna.

Re:What? (3, Informative)

John Hasler (414242) | more than 5 years ago | (#26180863)

The article is crap. The paper, however, makes sense. Read it.

I don't get it. (2, Interesting)

jcr (53032) | more than 5 years ago | (#26179933)

What's the win here? He's capacitively coupling the transmitter to its antenna, or what?

-jcr

Re:I don't get it. (4, Informative)

Ungrounded Lightning (62228) | more than 5 years ago | (#26180181)

He's using a waveguide coupling to launch the wave to an external hunk of waveguide, rather than running it through pins, wires, PC board traces, etc. The latter are very lossy at cellphone frequencies.

(I'm working on something similar right now and lose virtually all my signal going through about 6" of PC board wiring. B-( )

Re:I don't get it. (1)

Man On Pink Corner (1089867) | more than 5 years ago | (#26180325)

WTF? Even FR-4 only loses about 1 dB/inch at 8 GHz! Spend the bucks for better board material.

This whole article makes no sense at all. Matching networks are not especially lossy at cellphone frequencies.

Re:I don't get it. (0)

Anonymous Coward | more than 5 years ago | (#26180451)

You forget, this is about efficiency. Having 10 dB while using a shitload of power is less than having 5 dB using a lot less. The article is bad, but basically this is about reducing power usage (on a small scale: internally). I'm not too familiar with RF, but "3.3 mW down from 38 mW" seems to imply a small design change that affects a subset of the design (we are talking low mW, so even a great improvement in efficiency means little overall to most). But if it's cheap and easy to implement, why not do it, since it does save power (if what he says is correct).

Re:I don't get it. (5, Interesting)

inca34 (954872) | more than 5 years ago | (#26180223)

"The on-chip antenna feeds the LTCC patch antenna through aperture coupling, thus negating the need for RF buffer amplifiers, matching elements, baluns, bond wires and package transmission lines."

From the systems perspective he made a better RF transmitter block. Digging into that block and looking at the RF design level, he consolidated the circuitry normally used, such as the matching network for the antenna, transmission lines, and oscillator (for modulating the information onto the carrier frequency), into a single chip as opposed to multiple printed circuit board components doing the same job.

Beyond that I'd need to study the paper and find more detailed examples of cell phone architecture to have a better idea of the advantages and disadvantages over the legacy design.

Re:I don't get it. (4, Interesting)

TigerNut (718742) | more than 5 years ago | (#26180565)

Nevermind that he's apparently ignoring the true cause of a lot of the "lost" power - which is in the various bandlimiting filters that any real cellphone pretty much can't do without. It's tough to get a good multiband filter that doesn't have 1 to 2 dB insertion loss. The apertures are also geometric, so you are automatically sensitive to odd-order harmonics in both directions.

And I wonder how his aperture's impedance matches the amplifier out of band? From what I've seen in bleeding-edge RF architectures over the last 20 years or so, it's far easier to make a poor oscillator than a good amplifier, with any given set of components.

Re:I don't get it. (0)

Anonymous Coward | more than 5 years ago | (#26181185)

Actually, the battery consumption is mostly for the display/display electronics on the iPhone

Battery Life (1)

Koshari (1435453) | more than 5 years ago | (#26179989)

There definitely needs to be more research on battery life... it's advancing more slowly than the gadgets, which puts a ceiling on innovation!

But what % of battery use does it represent? (4, Interesting)

jriskin (132491) | more than 5 years ago | (#26180015)

I mean, my phone lasts for days if I don't use it and many hours if I'm just talking. The vast majority of power seems to be used when I'm watching video, playing games, or browsing the web. My guess would be this is more CPU-related.

So even if it saves 10x in the transmit/receive it still might only be a 2x overall savings or less. I suppose it depends on usage patterns.

Re:But what % of battery use does it represent? (4, Funny)

Kent Recal (714863) | more than 5 years ago | (#26180105)

I suppose it depends on usage patterns.

Yes. His approach would only help people who use their phones primarily to *gasp* make phone calls. Blasphemy?

Re:But what % of battery use does it represent? (2, Insightful)

morgan_greywolf (835522) | more than 5 years ago | (#26181149)

Or use a Web browser. Phones typically communicate with the Internet through the cellphone network over the two-way radio. This might improve WiFi phones, too, as WiFi also (obviously) employs a (much lower-power) two-way radio.

The radio is a non-trivial part (1)

Sycraft-fu (314770) | more than 5 years ago | (#26180121)

Goes double for WiFi, which is an extremely chatty protocol and thus sucks power. Could make WiFi much more usable in smartphones. Right now, if you play with WiFi much, you'll find that your battery gets drained fast as compared to EVDO or the like.

Re:But what % of battery use does it represent? (1)

IorDMUX (870522) | more than 5 years ago | (#26181203)

The largest battery hogs on your phone are the backlight and screen. After that, you have butt-loads of internal RF processing, and then, at a distant third, the antenna itself. The CPU, PMU, etc., are all eating from the same dish, as well. (I suppose if you have an Intel Atom, though, it would be sitting above the RF processor for power consumption.) My estimate of the increase in battery life would be in the range of low to moderate double-digit percentages, but it depends heavily on usage patterns, of course.

I bet... (0)

Anonymous Coward | more than 5 years ago | (#26180043)

I bet this is merely a power reduction for the chip itself.

Something somewhere will still have to use most of that 'saved' power to create the full-strength signal. There might be some savings since the full-power transmitter could then be moved right into contact with the antenna - but I doubt the result will be much more than a few extra minutes of battery life for devices implementing this design.

Less range by 3x; less power by 12x (2, Informative)

Anonymous Coward | more than 5 years ago | (#26180051)

Last line of the pdf:

The conventional LTCC package provides 3 times more range than the proposed design but consumes 12 times more power.

Re:Less range by 3x; less power by 12x (2, Interesting)

Anonymous Coward | more than 5 years ago | (#26180553)

Exactly. That means that this gives exactly zero improvement over the current arrangement. Range goes as the square root of power (assuming perfect isotropic radiation). If you reduce the transmit power by 12 times, the range at which the same detected signal level would be measured should drop by a factor of about 3.46. How is this better? Apples and oranges. To show that one is better than the other, they would have to be compared at the same received signal strength at the same range. The fact that these guys admit that they didn't do that puts this paper in the snake oil category.

Also, this only deals with the transmit side of things. In a phone, the antenna is also used to receive signals. Normally a T/R switch is used, which has loss. This paper does not include any mechanism for receive circuitry. Given that the oscillator is really part of the antenna, incorporating a receiver would be extremely difficult.

A further concern is that the transmit VCO is very tightly coupled to the antenna. The author of the paper cites this as an advantage. I wonder what would happen if I held this antenna near some metal? It would detune the antenna and therefore the VCO. This is called load pull and is always undesirable.

This scheme has no harmonic filter whatsoever. The pesky FCC makes you test this. Ironically, the Dept of Industry in Canada is even worse in this regard than the FCC. I doubt that this would pass those requirements.

I am no chip designer..... (1)

dindi (78034) | more than 5 years ago | (#26180065)

(only a software engineer) ... but when you tell me that replacing copper wires with a (wireless) transmitter and receiver helps save power, well, I am a non-believer. Sorry. It just does not cut it, whatever the headlines say. How about quality?

Re:I am no chip designer..... (4, Interesting)

paganizer (566360) | more than 5 years ago | (#26180195)

For once, something that I'm actually qualified to post on!
I was a Weapons system depot level tech in the navy, doing lots of work with waveguides, radar, etc. I went on to work in the private sector, doing among other things antenna design at Nortel.
I can't help but say this is a bunch of shit. It is ALWAYS more energy-expensive to do wireless, it's just the way things are.
If it is just the journalist making a mistake, I can see some possible advances in energy conservation using a waveguide, or even a virtual waveguide; anything else would only start to be possible if you enter the realm of high energy physics.
Unless this guy's name is Tesla, and/or they have developed a completely new principle...

Re:I am no chip designer..... (1)

Plekto (1018050) | more than 5 years ago | (#26180269)

The real question is how much you have to boost the signal to overcome the interference from the electronics nearby. Since we're talking about a digital transmission, this is very much a factor. Too much background noise and you get garbage at the other end (not quite like analog wireless). As such, digital cellphones have to boost their signal until they can get a connection - often quite a lot, in fact.

You can see this with a HDTV set and an antenna. Too low of a signal and you get no picture at all.

So a heavily shielded antenna and chip with a wireless transmission between them might very well save power, as the signal won't have to be boosted as much to connect to the local cell phone tower. In fact, it might also extend the range, since the real limiting factor of cell phone reception is the FCC and other agencies' limits on broadcasting power. If you can connect to the local tower/access point with, say, half the signal, due to it being less noisy to begin with, then you can get slightly better range as well if you keep the power levels at the old limits.

Re:I am no chip designer..... (1, Informative)

Anonymous Coward | more than 5 years ago | (#26180817)

You are completely full of crap. Digital modulation techniques work at much lower signal to noise ratios than analog methods.

You can see this with a HDTV set and an antenna. Too low of a signal and you get no picture at all.

With an analog signal, you would have seen a very noisy picture as received signal to noise is reduced. Digital digs it out of the noise until it is unrecoverable, all the while, presenting a perfectly clear picture.
The actual signal in both cases is an analog signal. The difference is in the information conveyed. Where digital transmission methods do require more transmit power is where the bandwidth is increased over its analog counterpart. HDTV conveys more information than an NTSC signal while occupying the same bandwidth. Therefore, it is more efficient. In the absence of multi-path, the digitally modulated signal should have increased usable range over the analog modulated signal transmitting the same power.

Re:I am no chip designer..... (1)

Kohath (38547) | more than 5 years ago | (#26180925)

It's much more complicated than you understand. All modern wireless communications are analog -- especially the digital ones.

The AC post is correct.

Re:I am no chip designer..... (5, Interesting)

Moof123 (1292134) | more than 5 years ago | (#26180529)

I'm not as qualified as paganizer, as I usually work at much higher frequencies (mmwave). However, losses from the PA to the antenna are typically pretty low. The claim of a 12x improvement implies the current interconnects are at best 8% efficient (utter BS! A rough tally follows the list below.)

From the PA to the radiated signal you typically have:

1. On-PA losses because of their design. For example, they typically have at least 3 different output stages to span from just a few milliwatts (a single HBT cell) up to full power (hundreds of milliwatts, hundreds of HBT cells). The parasitics of driving the unused cells at less-than-full-power operation create small losses, but I don't know a hard number for this.

2. Baluns/impedance transforms. PAs are typically class B operation with a load line that is just a few Ohms (3 V Vcc and hundreds of mA of DC power, so the RF load line is pretty steep). Solutions are matching structures, or a push-pull architecture through a balun to transform up to 50 Ohms. These usually account for 0.5-1 dB of loss (10-20% of power). The invention ignores this part of a cell phone's design.

3. Multi-band switch. Missing in this article is that most phones are designed to operate on at least 2, often 3 frequency bands. Several PAs are used, each designed to cover only one band. A GaAs pHEMT switch is usually used to switch between the two or more PA die. The invention does not address this aspect of cell phone design. These chips are either integrated in with the PA chip (separate die in the same carrier), or in some cases done in a different chip.

4. Lines from the PA chip to the antenna do have a modest loss, usually just a few tenths of a dB (a few percent). The article addresses this aspect of things.

5. The antenna is a clusterfuck of design hassles, as it is often dual- or tri-band in nature. A lot of compromises go on with the antenna. Making it have multiple resonances to cover the bands is hard. Making it small is hard. Making it work with the crappy ground plane, user's hand and head, and technicolor plastic case is damn hard. The article glosses over all this, and talks about a single narrow-band antenna scenario.
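Adding those stages up as a quick tally (the loss figures are roughly the ones quoted in the list above, with the switch loss an assumed typical value):

losses_db = {
    "balun / impedance transform": 0.75,   # "0.5-1 dB" per item 2
    "multi-band switch": 0.5,              # assumed typical insertion loss for item 3
    "line from PA to antenna": 0.3,        # "a few tenths of a dB" per item 4
}

total_db = sum(losses_db.values())
fraction_delivered = 10 ** (-total_db / 10)
print(f"post-PA loss: {total_db:.2f} dB -> about {fraction_delivered:.0%} of the PA output reaches the antenna")

Call it roughly 70% delivered, nowhere near the ~8% that a genuine 12x improvement would imply.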

Re:I am no chip designer..... (1)

Plekto (1018050) | more than 5 years ago | (#26180705)

But what about the display, the backlighting, the Bluetooth/WiFi, the internal speaker... I can think of a lot of things in a cell phone that also cause background noise that must be overcome. Those bare traces on the circuit board are essentially also acting like a microphone for any stray RF signals. The mistake, I think, is that many people are equating this with analog signals. With RF interference on digital signals, it comes down to how much you can boost the signal so the error correction still works.

Think of it like a dirty CD that you're trying to get to play. Obviously if you can clean half of the grime off of the surface it'll have fewer drop-outs and problems. A good player (or cell phone) likely won't care, but for a poor one it can be the difference between getting a signal in a bad area and nothing at all.

Note - a fun thing to do is to put your cell phone near a pocket radio and see how much noise various models generate even while just on standby.

Re:I am no chip designer..... (0, Flamebait)

Vellmont (569020) | more than 5 years ago | (#26180469)


but when you tell me that replacing copper wires with a (wireless) transmitter and receiver helps save power: well I am a non-believer.

Uhh.. and what about being a software developer qualifies you to have a valid opinion on saving power in radio transmission? I'm a software developer as well, and I found it surprising... but it astonishes me that you think you have the ability to have any kind of valid opinion on something so far afield of your area of expertise.

Re:I am no chip designer..... (1)

dindi (78034) | more than 5 years ago | (#26180913)

Well, I'm not sure what kind of software engineer you are if you did not study physics, mathematics, chemistry and economics at your university.

As of now, my studies and experience suggest that transmitting anything over wireless is far more expensive (as in, it needs more effort) than doing the same thing over a solid connection (copper, aluminium, gold, zinc, silver, etc.)...

But hey, my studies are dated, as I finished my IT studies in 1996. Sure, with that attitude you've been ... hmm, maybe 3 years in "the industry", or maybe you're still at school?

Re:I am no chip designer..... (1)

Vellmont (569020) | more than 5 years ago | (#26181225)


Well, not sure what kind of software engineer you are if you did not study physics, mathematics, chemistry and economics at your university

You have a very strange university where chemistry and physics are part of the software program.

As of now my studies and experience suggests that transmitting whatever over wireless is far more expensive (as in needs more effort) then doing the same thing over a solid connection (copper, aluminium, gold, zinc, silver ..... etc

Thanks. I guess I'll stick with people who actually have experience in the field they're talking about, rather than someone with some undergraduate classes in related fields who made a spitball guess at the plausibility of it.

I am a chip designer (0)

Anonymous Coward | more than 5 years ago | (#26180551)

Fine student paper. Marginal improvement in real terms.

Re:I am no chip designer..... (1)

russotto (537200) | more than 5 years ago | (#26181073)

(only a software engineer) ... but when you tell me that replacing copper wires with a (wireless) transmitter and receiver helps save power: well I am a non-believer. Sorry.

You're missing two things:

1) High-frequency RF is just plain weird, and
2) This is all near-field stuff; even at 5 GHz a chip package is substantially smaller than a wavelength.

operate the ***ANTENNA*** (1)

JonTurner (178845) | more than 5 years ago | (#26180085)

If the circuitry powering the antenna were the greatest consumer of power in the device, this would result in a significant improvement for the end user. However, it's all the other bits in the device which eat thousands of times more power -- the CPU, the display, the speakers, etc.

Interesting discovery, but the real-world savings will be small.

Lack of physical connection = Lower noise floor? (1)

SoopahCell (1386029) | more than 5 years ago | (#26180171)

This article sounds like bad marketing, but my guess is that having no physical connection means the antenna is completely isolated, and the lack of nearby circuitry reduces the noise floor on what it sends and receives, reducing the power necessary to send a clear signal.

He mentions "other parts" being part of the existing antenna connection... what would those be?

Tuned Antenna (1)

trum4n (982031) | more than 5 years ago | (#26180199)

I can tune a Pringles can to broadcast 10 miles on a standard (2.4 GHz, 802.11b/g) wireless router. If it is the slightest bit out of tune, I can't even get a mile. Why not tune a cell antenna? I know for a fact that they didn't even try in my Razr.

Re:Tuned Antenna (1)

Ender Wiggin 77 (865636) | more than 5 years ago | (#26180337)

How do you tune a pringles can?

Re:Tuned Antenna (2, Funny)

dgatwood (11270) | more than 5 years ago | (#26180585)

Same way you tune a fish.

Re:Tuned Antenna (1)

spinlight (1152137) | more than 5 years ago | (#26180843)

How do you can a pringles tune?

Re:Tuned Antenna (2, Funny)

dindi (78034) | more than 5 years ago | (#26180939)

start eating the pringles out of it, that reduces interference with the other components inside. That is a good start IMO

Re:Tuned Antenna (1)

thebes (663586) | more than 5 years ago | (#26180851)

The problem is that if you make the antenna too directive, you may miss out on certain cell sites entirely (as you mention in the example with your Pringles can). You may also notice that your Pringles cantenna is highly symmetrical about a few axes. You are also providing a substantial ground plane, which results in a design that better approaches theoretical design guidelines.

Cell phone designers have a lot to deal with: your big head, your hand, buildings, etc. In reality, a highly directive antenna on a mobile (one that is truly mobile, unlike your cantenna) would be catastrophic in terms of being able to make a call.

Re:Tuned Antenna (1)

trum4n (982031) | more than 5 years ago | (#26181075)

You can tune an omnidirectional antenna. Make the antenna, including the wire from the amp, exactly one wavelength at your resonant, or carrier, frequency. This increases your gain massively. Since the antenna element resonates at the carrier frequency, it is like singing in Radio City Music Hall instead of Yankee Stadium. Radio City is designed to make your voice louder and clearer; stadiums are meant to hold as many people as possible, with some sort of view.

Re:Tuned Antenna (1)

thebes (663586) | more than 5 years ago | (#26181273)

I am fully aware of how antennas are designed. Tuning the resonant frequency of an antenna should go without saying (though given the other posts on this story, I appreciate the fact that you spoke of what you knew, and limited yourself to just that).

My comment played off the OP's comment about a cantenna (which achieves its performance by increasing directivity). However, while tuning to your carrier frequency is generally accepted as the most basic guideline, you should take it one step further and include the radiation pattern in your overall design (which will ensure other performance characteristics are considered).

Also, true omnidirectional antennas don't exist...

Kind of a misnomer (3, Interesting)

SkOink (212592) | more than 5 years ago | (#26180273)

I don't think this will "significantly extend" mobile device battery life. As other people have pointed out, something that could practically save maybe 10 mW of battery power during transmit operation is interesting but not really all that dramatic. On the other hand, the author doesn't appear to claim that it will or won't significantly extend battery life. That may be a slashdottism :)

If I understood the abstract right, the gist of this is that he designed a transmit module with a small internal loop antenna, so that a larger transmit antenna could be inductively coupled instead of electrically driven. This means that all of the bias and driver circuitry internal to the transmit chip and also all of the bias and transmit circuitry external to the chip could be done away with. He coupled an antenna to the outside of a microchip to utilize what would essentially be 'waste' magnetic field in a conventional transmitter.

I would also bet that the big boys like Qualcomm probably do something similar already inside of their cell-phone modules. I would imagine that an approach like this eliminates much of the general purpose interfacing that needs to be done between some arbitrary microwave transmit module and some other arbitrary antenna, but things like cellphone transmitter chipsets are so tightly integrated that I bet they already implement something similar.

complete BS (0)

Anonymous Coward | more than 5 years ago | (#26180281)

The loss of energy in a co-ax cable is well known. It is usually measured in dB per 100 feet. Bad cable at 2 GHz might be 100 dB per 100 feet. I leave it to you as an exercise to figure out the loss of two inches of cable.

Re:complete BS (0)

Anonymous Coward | more than 5 years ago | (#26180901)

Do you have any clue what dBs mean? Not to mention that coax cable is a very good transmission line and microstrip generally less so (but it is easy to manufacture in high volumes for low cost).

Re:complete BS (1)

nsaspook (20301) | more than 5 years ago | (#26181131)

The loss of energy in a co-ax cable is well known. It is usually measured in dB per 100 feet. Bad cable at 2 GHz might be 100 dB per 100 feet. I leave it to you as an exercise to figure out the loss of two inches of cable.

Belden RG-174 has about 0.6 dB loss/ft @ 5.6 GHz
http://www.belden.com/pdfs/TechInfo/Coax%20Electrical%20Characteristics.pdf [belden.com]
http://www.belden.com/pdfs/03Belden_Master_Catalog/06Coaxial_Cables/06.59_66.pdf [belden.com]
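Working that out for the kind of run you'd actually find inside a phone (a sketch using the figures quoted in this sub-thread):

def fraction_lost(loss_db_per_ft, length_inches):
    loss_db = loss_db_per_ft * length_inches / 12.0
    return 1 - 10 ** (-loss_db / 10)

# grandparent's "100 dB per 100 feet" worst case, and the Belden RG-174 figure above
for label, db_per_ft in (("100 dB per 100 ft", 1.0), ("RG-174 at 5.6 GHz", 0.6)):
    print(f"{label}: 2 inches loses {fraction_lost(db_per_ft, 2):.1%} of the power")

About 2-4% either way, which is the point: the cable itself can't be where a 12x difference comes from.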

Insignificant power reduction (0)

Anonymous Coward | more than 5 years ago | (#26180445)

He reduced the amount of power lost on the antenna path by 35 mW. The iPhone battery is 1400 mAh at 3.7 V, and battery life is rated at 10 hours of talk time, so the iPhone uses about 518 mW during active use. This would only increase battery life by about 7%.
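The arithmetic, spelled out (a sketch; the 1400 mAh, 3.7 V and 10-hour figures are the commenter's, and the saving is taken as 38 mW minus 3.3 mW from the story):

battery_mwh = 1400 * 3.7                 # ~5180 mWh
talk_time_h = 10.0
avg_draw_mw = battery_mwh / talk_time_h  # ~518 mW average draw during a call
saving_mw = 38 - 3.3                     # ~35 mW shaved off the antenna path

new_talk_time_h = battery_mwh / (avg_draw_mw - saving_mw)
print(f"average draw {avg_draw_mw:.0f} mW, new talk time {new_talk_time_h:.2f} h "
      f"(+{new_talk_time_h / talk_time_h - 1:.0%})")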

morons (0)

Anonymous Coward | more than 5 years ago | (#26180515)

There is no massive efficiency increase. Where did you idiots get that idea? The paper doesn't say that there is. It reports a novel and largely useless technique for coupling the transmit power amplifier to the antenna through a small local wireless connection. The proper antenna still does its job. Quote: "The chip coupling to LTCC patch antenna improves the TX module gain by 32 dB and range by 23 m as compared to the on-chip antenna alone, without affecting the RF circuit performance and power consumption."
There might be some small efficiency improvement, but I doubt it. IT WILL ONLY BE SMALL, since the antenna matching circuit, if it uses high-Q components (and a decent board and layout), will be very efficient if you do it right.

Article pegged my BS filter (1)

xtronics (259660) | more than 5 years ago | (#26180819)

No matter how you transmit the power, you still have to drive the gate capacitance, and that takes a little bit of power. My hunch is that the writer didn't understand anything the guy said and was just winging it, to the public's detriment.

You can dissipate power by radiating when you don't want to - there are many ways around this.

Utterly Useless..... (1)

IHC Navistar (967161) | more than 5 years ago | (#26180873)

This idea is pretty useless, since it has been confirmed that cellular companies do not truthfully report the amount of battery life left, so people will make shorter calls and not take up the valuable bandwidth that they oversold...

Wireless wireless wireless chargers (1)

GottliebPins (1113707) | more than 5 years ago | (#26180935)

I remember back when we used to have wires going all over the place to connect everything. Then they invented wireless connections and everything had to be plugged in to rechargers and we had wires going all over the place. Then they invented wireless charging bases for all our wireless devices and we had wires going all over the place. Then they invented...

Student? (4, Insightful)

tyrione (134248) | more than 5 years ago | (#26181087)

What a horribly misleading title.

Ph.D. candidate... is factual and much less sensationalized.

this has limited application (1)

cats-paw (34890) | more than 5 years ago | (#26181089)

Here's the idea:

Generally speaking, you generate the signal (using an oscillator), then run it through an amplifier and filter before it goes to the antenna. Each of those stages consumes power. The amplifier has a finite efficiency, which means you get less power out in the signal than you put into the amplifier, and the filter has loss.

The idea here is that the signal generation using the VCO (voltage-controlled oscillator) is combined with the filtering and the antenna, in essence, as "one step".

However, this is going to have a very narrow range of application. There are many reasons why a VCO needs buffering and isolation from the outside world. There are many cases, especially for complex modulation, where coupling the VCO this closely to the outside world will degrade its performance (yes, even if you are using a PLL) to the point that you wouldn't be able to make a working radio with it.

So for certain low-power, low-complexity applications this does help, but for anything which needs a "real" radio it won't do much good. It's really more of a packaging gee-whiz.

tis a trap (0)

Anonymous Coward | more than 5 years ago | (#26181167)

They must have developed a method of intercepting these mini transmissions.
