Fiber optics and our future!

Is anyone else looking into the not-so-distant future and getting excited that, before we know it, our processor speeds will increase from the GHz range to the THz range? Call me a nut, but I would love to be part of the R&D team that develops fiber optic transistors small enough to be used in a microprocessor!

Fiber optics are our friend and our future! Thank god I will soon have my BA in EET!
 
My Dad worked at Bell Labs in the 1960s, and he remembers walking by the desk of someone who had a bunch of optical fibers on it; people would comment, "We ought to find a use for them."

They are close to maxing out the speed of current processor technology, hence the move to multiple cores. That sounds great, but the benefit depends on the workload: it's very difficult to take a single unit of work and spread it across multiple cores.
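A quick way to see why is Amdahl's law: the serial fraction of a task caps the overall speedup no matter how many cores you add. A minimal sketch in Python, assuming an illustrative 90%-parallelizable workload (not a measured figure):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction and n is the number of cores.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when only part of a task can run in parallel."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Assumed example: a task that is 90% parallelizable.
for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.9, n):.2f}x speedup")

# Even at 16 cores the speedup is only ~6.4x, because the 10% that must
# run serially dominates -- each extra core helps less than the last.
```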
 
^ That's the fantastic part about FO. The bandwidth is so large that no one would come close to using a FO processor to its full potential in a personal computer, at least under today's usage. Put that amount of bandwidth in a single computer and whammo, never another need for an upgrade lol.
 
Originally Posted By: lugNutz
Call me a nut, but I would love to be part of the R&D team that develops fiber optic transistors small enough to be used in a microprocessor!

That's about the least nutty thing I've read on BITOG in a long time.

These kinds of breakthroughs are all about huge numbers of people building on each other's work. Someone or some team gets the credit, but it takes a lot of people to get them to that point. So, I'd say there's plenty of room for you on that wagon. Jump on! Don't stop at your BA -- go for an advanced degree and join a lab!
 
Originally Posted By: lugNutz
^ That's the fantastic part about FO. The bandwidth is so large that no one would come close to using a FO processor to its full potential in a personal computer, at least under today's usage. Put that amount of bandwidth in a single computer and whammo, never another need for an upgrade lol.


If I only had a nickel for every time I've heard that since the 1980s. You'll never use all that memory, you'll never fill a 20 MB hard drive, 1 GHz is so fast you won't need to upgrade for 10 years... yadda, yadda, yadda. If they build it, programmers will write more bloated code that needs it.
 
...or there will be better features.

Bloated as it is, a lot of modern software is WAY more functional and pleasant to use than its predecessors.

Also, before we assign too much blame to programmers, let's keep in mind that a great deal of code bloat nowadays comes from compilers. It has been many years since programming in machine language was feasible...
 
Originally Posted By: d00df00d
Don't stop at your BA -- go for an advanced degree and join a lab!

That's my plan. Ultimately, come Feb/Mar I will be in the job market, and one of the most important considerations will be tuition reimbursement. I plan to move on to an MBA in something that will actually pertain to my career, depending on where I'm employed.

I would also like to note that regardless of how bloated code is, when it comes to FO, theoretically speaking, you wouldn't notice a difference. Think about the fact that we are data-linked to Europe (as an example) by fiber. Without it, we would still be sending letters by boat instead of using email, Skype, or whatever method of communication one prefers. The entire world is networked together by wires that we don't see (unless you work on repairing those lines, of course). So in short, if the ENTIRE WORLD can communicate over multiple connections simultaneously, no code will ever be written that could slow down FO. Unless I'm wrong here, of course.
 
The idea of the optical transistor, and the CPU technology that would follow from it, has been generating a bit of buzz for a while now, and I've been following it on and off. I believe it will certainly change the future of computing, as it opens the door to so much more processing power.

Will be interesting to see where this goes.
 
I'm sure optical technologies will be the mainstay of data transmission for a while yet, but I'm interested to see whether optical computing will beat quantum computing to prime time...
 
Originally Posted By: OVERKILL
The idea of the optical transistor, and the CPU technology that would follow from it, has been generating a bit of buzz for a while now, and I've been following it on and off. I believe it will certainly change the future of computing, as it opens the door to so much more processing power.

Will be interesting to see where this goes.


Originally Posted By: d00df00d
I'm sure optical technologies will be the mainstay of data transmission for a while yet, but I'm interested to see whether optical computing will beat quantum computing to prime time...


We shall see soon enough!
 
The limitation on processor frequency is due to the skin effect in the wires and leakage at the transistor gates. It just doesn't make much sense to go above 4 GHz if you want reasonable power consumption and heat. It is also very hard to make use of more than 4 cores, so I think the future is about energy efficiency and dedicated circuits that help without doing all of that work in software.
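To put rough numbers on the power-versus-frequency trade-off: dynamic CPU power scales roughly as P = C·V²·f, and higher clocks generally require higher core voltage, so power climbs much faster than the clock does. A minimal sketch below; the capacitance and voltage/frequency pairs are assumed purely for illustration, not taken from any real chip:

```python
# Rough dynamic-power model: P = C * V^2 * f
# C: switched capacitance (farads), V: core voltage (volts), f: clock (Hz).
# All numbers below are illustrative assumptions, not real chip specs.

def dynamic_power(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    return capacitance_f * voltage_v**2 * freq_hz

C = 1e-9  # assumed 1 nF of effective switched capacitance

# Raising the clock typically also means raising the voltage, so going
# from 2 GHz to 6 GHz here roughly septuples the modeled power draw.
for freq_ghz, volts in [(2.0, 0.9), (4.0, 1.1), (6.0, 1.4)]:
    watts = dynamic_power(C, volts, freq_ghz * 1e9)
    print(f"{freq_ghz:.1f} GHz @ {volts:.1f} V -> {watts:.1f} W (modeled)")
```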

Things like DSPs for signal processing, graphics cores, etc. offload work from the main CPU and make 10 hours of battery life on a tablet possible. This is where the future seems to be heading: people are going back to basics and demanding that software not waste the processor, and are throttling the processor back for power / battery life. This is why Flash is in trouble (the reason iOS does not run Flash is its power consumption).

Regarding fiber: everything is already running on fiber except the last mile, from your phone company's local "lawn fridge" cabinet to you, from your local cable provider's terminal to you, and from the cell tower to you. We may gradually see them move toward fiber for the last mile, but it would take a while and cost a lot, assuming you are willing to pay $50/month for a couple of decades. I'm not sure everyone will pay for it, and at least in the US there will be cheaper alternatives (e.g. lawn fridge to your home via a microcell LTE that covers only one block), or the monopoly will just cap your downloads to save on expansion costs.

Optical processors were already being talked about a decade ago, when I was in school. They didn't go anywhere because no one wants to pay for the infrastructure (fabs) to make them, it is much harder than originally thought, and electronic transistors keep getting cheaper and are still good enough.
 