
Apple Employees Denying Computer Virus

I would argue that in proportion, PCs still dominate in that regard. In professional environments, they are still cheaper and more practical. Companies and businesses are still interested in a certain bottom line. In professional environments, PCs are simply the better option. I think Macs are only for a niche group of the market... who prefer paying a premium for getting less.

Professionals should opt for more powerful machines, something which Macs simply aren't.

Horseshit.

You're talking about business, he's talking about media production. They're two different things.

Oh, and the MacBook Pro I'm typing this on would eat yours, or most anyone else's computer on here, alive. Your statement that Macs are less powerful is not only untrue, it's laughable. There's a reason Macs are used for everything media- and video-related, and it's not because they're pretty.
 
LMAO. I'm not giving you anything. I've already been through this in other threads.

If you're going to post something as ridiculously stupid as you've posted on the previous page, you better have something to back it up. Seeing as how you don't, you owe us an apology.
 
^ Wow! Quite the tirade!

Take a breath, then prove whether you are posting fact or fiction. If Macs lack functionality, prove it - post a list of things your Windows machine does that my Mac does not. Note that everybody concedes Macs are not the right choice for gamers.
 
I've already proved it a dozen times. You're a riot you know that? You make me laugh. You keep denying the facts. Again, I've said enough.

No, you never have. But that's okay. Sometimes no words say more than many. :-) As is so often the case in these types of debates, zealotry is generally not backed by facts. (And that goes for both sides of any debate.)
 
What utter nonsense. Your refusal to support any of your statements shows you just can't back them up. You're no different to a Mac zealot who posts stuff like "OMFG, Mac is da bomb, Windows sux!"

Nothing to back up your ideological statements.
 
Wow, this thread died a rapid death! Keep it civil, boys, or Corny'll shut it down before you can say "Fanboy"! :-)

No need for name calling, we're only discussing computers, not politics!
 
Most movie studios run PCs based on Linux. But then again you wouldn't know that because all your responses are based on outdated notions.

Let's be clear here. It's true that the majority of render farm nodes in 3D film production houses run proprietary Linux software. Pixar, for example, run RenderMan on a 2,000-node Intel CPU farm. ILM have a similar setup, also running Intel nodes.

But that has nothing to do with commercial computing. Those machines are custom-built servers running custom-built software. We're talking about commercially available computers that can be bought by any regular person or business.

In that respect, film and television companies predominantly use Macs. Half of all Adobe licenses sold are Mac licenses. It's true that, in the past decade, the print industry has moved from a predominantly Mac base to a more Windows-oriented environment, but not so in TV and film. QuickTime is the predominant container worldwide for professional media distribution, and QuickTime's ProRes codec is probably the most common world standard for video editing these days. When I make a TV commercial for international release, it is sent to every other country in the world as a QuickTime file.
 
Inadequate. By the way, Thunderbolt is absolutely impractical for reasons I don't want to get into right now. Yes, it's inferior. Sorry to break it to you. But my Phenom is better than that. You're the only one talking gibberish and total nonsense.

Actually, it sounds like you don't understand the potential of Thunderbolt. It's a whole new type of external data bus, unlike any USB or FireWire before it, with four times the bus speed of USB 3. It's a direct extension of the PCI Express architecture on the motherboard. Companies are already building Thunderbolt PCI Express docks, which let you insert regular PCI Express cards just like a regular computer chassis - display cards, I/O cards, etc. So you can, for example, plug a MacBook Pro into your dock with one connector, and from that you're instantly connected to additional display cards, video capture cards, fibre-channel network cards, as well as directly to multiple displays and external drives. It gives you the expansion capabilities of a high-end desktop machine, with 20 Gbit/s data throughput, over copper or fibre optic cable.

So it's disingenuous to compare Thunderbolt with other external bus systems like USB - it's a vastly more comprehensive and significant technology.

http://en.wikipedia.org/wiki/Thunderbolt_(interface)
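
To put those numbers in perspective, here's a back-of-envelope comparison of how long a 50 GB media file would take to move over each bus at its theoretical peak rate (peak figures only; real-world throughput is always lower, and the file size is just an example):

```python
# Back-of-envelope: time to move a 50 GB file at each bus's peak rate.
# These are theoretical peaks; real-world throughput is lower.
FILE_GB = 50

buses_gbps = {
    "USB 2.0": 0.48,                # 480 Mb/s
    "FireWire 800": 0.8,
    "USB 3.0": 5.0,
    "Thunderbolt, 1 channel": 10.0,
    "Thunderbolt, 2 channels": 20.0,
}

for name, gbps in buses_gbps.items():
    seconds = FILE_GB * 8 / gbps    # GB -> gigabits, divided by Gb/s
    print(f"{name:>24}: {seconds:7.1f} s")
```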
 
No it doesn't. Want to know why? It's not practical. I won't even get into the reasons why. There are hardware limitations. Plus why settle for just that, when manufacturers won't even manufacture devices for it? LMAO. Owned..

Please provide some explanation for the "hardware limitations" of Thunderbolt technology.
 
AMD is gaining market share, and fast. It's already been estimated they may control half the market by 2012. Want to know why? Superior cost/performance ratios. Plus they settled a lucrative lawsuit against Intel... and gained a huge advantage with ATI.

Unfortunately, this is no longer true. Your earlier citation (woohoo - you cited some evidence for once - good for you!) predicted growth in AMD's market share that failed to happen. AMD's share slumped in the first quarter of 2011, to 25% of the market versus Intel's 75%.
http://www.cpubenchmark.net/market_share.html

In the high-end CPU market, like the chips used in Mac Pro desktop systems, AMD don't even really exist. The first AMD CPU appears about 30 places down the list based on performance benchmarks.
http://www.cpubenchmark.net/high_end_cpus.html

In value for money at the low end of system specs, however, AMD fare very well. AMD can be a great choice for those on a budget.
http://www.cpubenchmark.net/cpu_value_available.html
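
That "value" chart is essentially just benchmark score divided by price. A toy version of the calculation (the names, scores, and prices below are made up for illustration, not real benchmark data):

```python
# Toy "value for money" ranking: benchmark score per dollar.
# Names, scores, and prices are made up for illustration only.
cpus = [
    ("budget AMD chip", 3500,  90),   # (name, score, price in USD)
    ("mid-range Intel", 6200, 180),
    ("high-end Intel",  9000, 550),
]

for name, score, price in sorted(cpus, key=lambda c: c[1] / c[2], reverse=True):
    print(f"{name:>16}: {score / price:5.1f} points per dollar")
```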

I think PCStats.com says it better than I can:

If you're building a new computer or buying a mainstream PC, a very nice Intel processor will only set you back $150-200US. For that, what you'll get will be faster than the AMD equivalent at that price. Factor in the cost of the other computer components, the fact that going AMD is no longer 'the cheaper option', and it really just makes sense to build Intel this time around.
http://www.pcstats.com/articleview.cfm?articleid=2164&page=2

The same article talks about the "ebb and flow" of CPU technology. This year, Intel's on top. Next year, AMD probably will be again. Competition drives innovation, and we all win.


But the CPU debate forces me to ask you a question, Giancarlo. A primary point of yours in our recent debate was that Windows' greater market share was evidence that Windows was better. Yet here you are touting the benefits of AMD, by far the smaller player in CPU market share. Isn't this an admission by you that bigger is not always better? That the innovators are not always the most popular? And that something that suits the majority may not suit everybody?

Isn't this simple fact clearly demonstrating my point all along - that people have different needs, different wants, and make decisions that aren't necessarily wrong, just appropriate to their own requirements? You can't seem to fathom why someone would buy a Mac over a cheaper, better-specced Windows machine, but by your own admission you prefer a CPU which the benchmarks say isn't the best value for money.

Please, let's keep responses civil and controlled. I've never shown you any disrespect or called you a name (except Spec-boy once, but that was all in fun! :-) ). I hope you can offer the same courtesy. This topic is not worth getting upset about.
 
I see what you're saying, but no. It's not that difficult, and there are no obscure codes.

Just to clarify what Johann was saying about "codes": on a Mac, there are quick shortcuts to create many common special characters, such as:

• alt-8
ü alt-u
Ü alt-U
é alt-e
® alt-r
™ alt-T

These are OS-defined, so you don't need additional software to help out.

In Windows, you need to know the numeric code of the character you want. You also need a numeric keypad, so you'll need a different method on a keypad-less laptop. The process is explained here:

Windows assigns a numeric code to different accented letters, other foreign characters and special mathematical symbols. For instance the code for lower case á is 0225, and the code for capital Á is 0193. The ALT key input is used to manually insert these letters and symbols by calling the numeric code assigned to them.

To use the codes:

1. Place your cursor in the location where you wish to insert a special character.

2. Activate the numeric keypad on the right of the keyboard by pressing Num Lock (upper right of keyboard). The Num Lock light on the keyboard will indicate that the numeric keypad is on.

NOTE: You must use the numeric keypad; if you use the number keys on the top of the keyboard, the characters will not appear. If you are on a laptop or computer without a separate numeric keypad, one of the other methods is recommended.

3. While pressing down the ALT key, type the four-digit code on the numeric keypad at the right edge of the keyboard. The codes are "case sensitive." For instance, the code for lower-case á is ALT+0225, but capital Á is ALT+0193.

NOTE: If you have the International keyboard activated, you will only be able to input codes with the ALT key on the left side of the keyboard.

4. Release the ALT key. The character will appear when the ALT key is released.

NOTE: You must include the initial zero in the code. For example, to insert á (0225) you must type ALT+0225, NOT ALT+225.

http://tlt.its.psu.edu/suggestions/international/accents/codealt.html

Easy, huh?
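
(Incidentally, those four-digit codes are just the character's Windows-1252 code point, zero-padded. A quick Python sketch, purely illustrative, to look one up:)

```python
# The Windows ALT codes are just the character's Windows-1252 code
# point, zero-padded to four digits. A quick lookup:
def alt_code(ch: str) -> str:
    code = ch.encode("cp1252")[0]  # the character's Windows-1252 byte value
    return f"ALT+{code:04d}"

for ch in ["á", "Á", "ü", "Ü", "é", "®", "™", "•"]:
    print(ch, alt_code(ch))
# Prints á ALT+0225, Á ALT+0193, ... matching the codes quoted above.
```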

This is at the core of why I choose Macs. It's little things like this that define what I call "elegant simplicity". Little touches like a magnetic clip on a laptop cover that you can open with one finger; a magnetic power connector, so if you trip over the power cable your laptop doesn't end up on the floor (and you can plug it in in the dark! :-)); instant wake from sleep, every time; and a multi-touch trackpad with gestures, which is so far above and beyond Windows' tacked-on touch support in Windows 7.

It's these little, human things that help make Macs worth the money to me. It's why Mac enthusiasts talk about experience more than specifications.
 
Guys, guys... I myself do enjoy the occasional flame war. Making the other guy's fav. OS look bad and badmouthing the fanbois... all fine :) But attacking each other on a personal level and name-calling is something entirely different. If you can't participate in a heated debate without playing dirty, step away from it.
 


bu... bu... but whyyyyyyeeee?
 
You do know that Thunderbolt was developed by Intel, right?

I'm not talking about the speed. That's irrelevant here. It won't make it into the market. Manufacturers are simply not interested in it.

http://www.crunchgear.com/2011/03/02/amd-belittles-thunderbolt-says-its-unnecessary/

“Existing standards offer remarkable connectivity and together far exceed the 10Gb/s peak bandwidth of Thunderbolt. These solutions meet and exceed the bandwidth utilization of many peripherals,” a spokesperson for AMD said.

---

Yes, it's another example of the AMD v. Intel war (because Thunderbolt is an INTEL INVENTION), but the spokesperson has a very fine point there.

---

http://jdrch.posterous.com/why-intels-thunderbolt-tech-will-fail

So basically Thunderbolt takes you from having 1 central controller for all your devices in the USB case to having a controller for each one. This increases Thunderbolt's OEM peripheral production costs vs. USB, which has been capable of handling up to 127 devices per controller since its 1.0 release. Controller cost matters much more to a peripheral OEM who's selling a $200 device than to a PC OEM who's selling a $1K+ machine.

"What does this all mean? The port and controller costs will prevent Thunderbolt from appearing on non-high end devices for a long, long time. Perhaps even forever, since USB will always be cheaper and easier to implement, while offering more compatibility and comparable performance (assuming the USB 3.0 interface isn't a bottleneck). Of course, Intel is likely to do its best to limit USB 3.0's compatibility by dragging its heels on native chipset support. It won't be able to do so for long, though: USB 3.0 devices are pouring onto the market, and AMD has the tech on its chipset roadmap. Thunderbolt devices? Only 2 have been concretely announced, and neither of them are on store shelves."

-----

This is why, in my eyes, from a strictly business standpoint, Thunderbolt will be a total business failure. It's simply not practical, OEM manufacturers continue to choose USB, and USB 3 already has several devices out. Sorry.

It's all about having support from OEM manufacturers, which Thunderbolt simply won't have. That makes it impractical.

I'm sorry, but I wasn't arguing about its speed. That wasn't my point. I was arguing about practicality, and that's why it'll fail. When all the manufacturers decide to support the other standards out there, like USB 3, the speed difference is irrelevant. And as AMD said... existing standards already exceed the speed of Thunderbolt, hence why I think it'll be an incredible business failure. Another one by Intel. They are taking Nvidia down into the toilet right now.

Erm. You do know that Apple invented what is now known as Thunderbolt, don't you?

http://gigaom.com/apple/intels-light-peak-was-apples-idea/

The market for Thunderbolt is also a little different than USB's, since it's so much more powerful:

http://www.tuaw.com/2011/04/11/thunderbolt-peripherals-announced-at-nab-this-week/

(that backs up what Andy and I have said, that Macs are the predominant computer in the video, music, and graphics industries)
 
I've no interest in a pissing contest between Apple and Intel - perhaps we could all just agree that Thunderbolt is a collaboration between the companies, as Intel says in the link above.

It's pretty disingenuous to compare Thunderbolt and USB 3 - they're just different things. You're right, Thunderbolt will probably never have wide mainstream appeal; it's likely to remain a more niche tool for higher-end applications, like FW800 is today.

For professional applications, Thunderbolt has some amazing implications. Imagine a live production with multiple HD video screens on stage. A single MacBook Pro could drive up to 6 displays on stage from a remote location, all via one tiny piece of fibre optic cable. That's huge for people in my industry.
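
A naive back-of-envelope check on that claim (raw 24-bit pixels at 60 Hz; this ignores DisplayPort framing, blanking intervals, and protocol overhead):

```python
# Naive check: six uncompressed 1080p60 streams vs. 2 x 10 Gb/s channels.
# Ignores DisplayPort framing, blanking intervals, and protocol overhead.
width, height, bpp, hz = 1920, 1080, 24, 60

per_display_gbps = width * height * bpp * hz / 1e9
total_gbps = 6 * per_display_gbps
print(f"per display: {per_display_gbps:.2f} Gb/s, six displays: {total_gbps:.1f} Gb/s")
print("fits within 20 Gb/s:", total_gbps <= 20)
```

About 3 Gb/s per display, roughly 18 Gb/s for six - which would indeed squeeze inside the 20 Gb/s figure quoted above.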

It's no surprise that AMD flame Thunderbolt - as your own link states, they don't want anything to do with it. Firstly, because they'd have to license it from Intel; and secondly, because it's not really relevant to them. AMD are virtually non-existent at the high-performance end of the CPU market, so it's unlikely their market will require technology like Thunderbolt. In the low-to-mid range of CPUs, where AMD are focussed, USB 3 will be just fine.
 
GameOver, I believe Macs are better for beginners.

PCs, especially when troubleshooting, call for a higher level of expertise. And, let's face it—the average beginner doesn't fully understand the seriousness of the virus problem.


Uhh, this Apple 'beginner' is learning Terminal commands on the BSD Unix kernel... in OS X.
 
Hate to break it to you, but Apple doesn't invent anything. They rely on third-party manufacturers, such as Intel, to do that.

http://www.intel.com/technology/io/thunderbolt/index.htm

What is Thunderbolt technology and how does it work?

Developed by Intel (under the code name Light Peak), and brought to market with technical collaboration from Apple. Thunderbolt technology is a new, high-speed, dual-protocol I/O technology designed for performance, simplicity, and flexibility. This high-speed data transfer technology features the following:

------

This is something Intel invented and brought to market with only some cooperation from Apple. It's not an invention by Apple. While Intel will say it was Apple's idea, it was developed in Intel's labs.

This was on Intel's website, by the way. So I think I'll take it over whatever source you come up with.

The market is very limited for Thunderbolt. Read my sources please. This is why Thunderbolt will be a business failure in my opinion.

Macs are not the predominant machines in the video, music, and graphics industries. Not at all. Time to get out of the late 1990s. Times change. Best not to be left behind.

I'll post this truthful statement once again for you:

"Existing standards offer remarkable connectivity and together far exceed the 10Gb/s peak bandwidth of Thunderbolt. These solutions meet and exceed the bandwidth utilization of many peripherals."

--------

This is true. Forget who said it. Try disputing the facts in this statement. Thunderbolt is actually limited technology, hence why it will fail. OEM manufacturers will continue to ignore Thunderbolt, and in addition there could be further revisions of USB 3.0 that push its bandwidth past the 10 Gb/s peak bandwidth of Thunderbolt. The reason is that USB 3.0 is far cheaper and makes more sense.

It's not actually more powerful than existing standards. In fact, it's actually limited. Very limited. That's the intrinsic flaw in the technology.

Why won't you address the facts I post? When I bring up sources, I see the replies and just shake my head. I don't get a response.

I may have to just take some advice and start placing people on ignore. I really don't want to risk getting into another flame war about this again and losing my cool.

OK, seriously. Start reading the links people post. Apple invented it and handed it off to Intel for integration into Intel's boards. Apple invented it to replace FireWire, and to provide a higher-speed connection than anything USB was going to come out with.

You're also missing the point of why it was invented to begin with. (You skirt around it on the way to the point you're making, but don't see it.) It's way more powerful than anything USB can muster, thanks to the fact that it can daisy-chain multiple devices with no loss in bandwidth. The entire purpose of Light Peak isn't just to replace USB, it's to replace EVERYTHING. It is a multi-purpose connector. Just imagine running a monitor, hard drive, sound input, and high-def video input all out of one single connector with no loss in speed or signal. THAT is why it's significant. (And it's something USB 3 can't do.) So while all of the existing standards together can offer more bandwidth, Thunderbolt can offer similar bandwidth in a single port.
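
One way to picture the single-port argument: add up what a realistic chain of devices actually demands and compare it to one channel's capacity (the device names and bandwidth figures below are illustrative assumptions):

```python
# Toy bandwidth budget for a hypothetical Thunderbolt daisy chain.
# All device names and figures are rough, illustrative assumptions.
chain = [
    ("1080p display",    3.0),   # Gb/s
    ("RAID array",       5.0),
    ("audio interface",  0.1),
    ("HD video capture", 1.5),
]

channel_gbps = 10.0  # one Thunderbolt channel
demand = sum(gbps for _, gbps in chain)
print(f"chain demand: {demand:.1f} of {channel_gbps:.0f} Gb/s "
      f"({'fits' if demand <= channel_gbps else 'over budget'})")
```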

It's no more limited than any other I/O. In fact, it is LESS limited, because of what it is capable of. Sony has already adopted it. Samsung has adopted it. HP will adopt it once more peripherals are on the market. Asus will as well.

Like it or not, it's here to stay, and it isn't going to be a 'failure'.
 
WRONG. I've already posted facts from Intel's website. They are the ones who claimed, on their own website, that they developed it and then had technical cooperation from Apple to bring it to market.

I'm not going to get into this with you. You obviously refuse to read my links, and you refuse to read anything posted by anyone who disagrees with you.

It's a failure from both a practical and a business point of view. It will not be a lasting technology. And one other thing: no, it doesn't offer just as much bandwidth in a single port. The other I/O formats offer vastly more bandwidth.

Your constant refusal to read the facts and my links has left me with no other option.

You're now on my ignore list. Have a good day.

Refuse to read your links? I read your links. You didn't read mine.

Apple invented it and handed it off to Intel. Just like Apple trademarked the name and transferred it to them.

You just don't like it when you're wrong. And you are most certainly wrong. But have fun with your AMD board. I'll enjoy daisy-chaining 2 monitors, an external hard disk and some other peripherals with no hit in bandwidth. ..|
 
The most recent news on the GPU market, which the supposed multimedia experts should know about: AMD is poised to gain significant control, and AMD's top-line processors are faster than Intel's, in combination with the ATI advantage. AMD simply isn't the low-end company you want to paint it as. Well, that's to your own disadvantage.

http://www.gpu-wars.com/2011/05/nvidia-market-share-slide-confirmed-by.html

(BTW, ATI is referred to simply as AMD in the above, because ATI is being phased out as a name as AMD centralizes its branding)

Please... you guys never cease to amaze me.

It's looking really bad for Nvidia right now. Another one of Intel's failed ventures.

I'll enjoy the vastly superior performance of my six-core Phenom, which smokes Intel. Have fun on your simple K-12 computers. :) Enjoy :) I'll enjoy my vastly superior ATI GPU too... which smokes Nvidia.

With an attitude like this, why bother talking to you? It's like talking to a kindergartner.
 