Computers & Technology

They know that consumers won’t take it; that’s why they work so hard (ahem, Comcast) to have their local monopolies. That way, the consumer is forced to take it, as there is no alternative…unless slow DSL and dial-up count. Here in the FiOS oasis, those who elect to get Comcast realize their mistake right quick and have the Verizon guys back outside their house, though FiOS itself is rather brutal too.

I think that sums it up.

Amazon finally puts its streaming video service onto Android, including Prime.

I think my BD drive is starting to die.
It reads CDs & DVDs fine, but for the past few weeks a large portion of my BDs don’t even register as being in the drive, even ones I’ve watched before.

I popped in Kinmoza today and had to eject/close the drive something like 40 times before it would actually register as being in the drive.

That’s brutal, and odd. This is a stand-alone BD player, right? That does lead me to suspect that the DRM tech in the player is “being special” and is the cause of this.

…After the anti-copy tech of DVD was cracked with hilarious ease, I’m still shocked at the hubris of Sony (et al) to include more anti-copy tech in BD; if it hadn’t been for that garbage blocking me from playing my legitimate BDs, I wouldn’t know now how to rip a BD :wink:

It came in my PC when I bought it, so it’s about 5 years old now.
I’m debating whether to buy a lens cleaner and see if the lens is just dirty, since BD lenses are more sensitive than CD/DVD lenses, or to just take it in and have someone knowledgeable troubleshoot and fix it for me.

I have my doubts that 4K will catch on with the mainstream. We don’t even have 1080p broadcast yet, and Blu-ray still tops out at 1080p30, so there’s no point to 4K for TV watching. More importantly, 4K can’t be done at more than 30 FPS over the current HDMI standard.
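
Rough back-of-the-envelope math on that last point (my numbers, not anything official: assuming 8-bit color, ignoring blanking intervals, and taking HDMI 1.4’s usable video bandwidth as roughly 8.16 Gbit/s after 8b/10b overhead):

```python
# Rough estimate of uncompressed video data rate vs. what HDMI 1.4 can carry.
# Assumptions: 8-bit RGB (24 bits/pixel), blanking intervals ignored,
# HDMI 1.4 usable video bandwidth ~8.16 Gbit/s (10.2 Gbit/s TMDS minus 8b/10b overhead).

HDMI_1_4_USABLE_GBPS = 8.16

def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (no blanking, no compression)."""
    return width * height * fps * bits_per_pixel / 1e9

for label, fps in [("4K @ 30 fps", 30), ("4K @ 60 fps", 60)]:
    rate = video_gbps(3840, 2160, fps)
    verdict = "fits" if rate <= HDMI_1_4_USABLE_GBPS else "does NOT fit"
    print(f"{label}: ~{rate:.1f} Gbit/s -> {verdict} in HDMI 1.4")
```

4K30 squeaks in at roughly 6 Gbit/s; 4K60 needs about double that, which is why it had to wait for HDMI 2.0 gear to show up.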

If they’d put the time into it, even the XBone has the graphical power to push 1080p60. The Xbone’s GPU is dated, but it still comes from a time when 1080 didn’t even cause the tech to break a sweat.

Besides, 4K is already out of date. It’s all about 5K now. :slight_smile:

Haven’t the edition wars between console generations already hit the need for a 401k yet?

I guess they have, in general, but this latest war, with consoles not being able to drive the displays connected to them, is still new to the workplace :slight_smile:

Somehow that complaint didn’t come up in the ’90s, when only a few N64 games could push the CRTs of the day to 640*480 (and, IIRC, those weren’t made by Nintendo); it only seemed to come about in the time of the PS3 and 360.

I don’t really agree that the modern consoles can’t drive 1080 as I have much older hardware lying around that can, and this is tech that was lower grade even in its heyday.

The simple fact that a console that shouldn’t be as powerful (Wii U) can pump out 1080p while the XBO requires an insane amount of work (or very little for the CPU to do) is a bit absurd.

I got a $35 computer that might be able to play Minecraft at 1080p! (I haven’t tried it)

I hope that Microsoft can clean up Minecraft as I hear that Mojang/notch have been infamously lazy about optimizing the thing, making it capable of bogging down in some instances on even the beefiest of machines. I wonder if, as things stand, the rather weak Windows phones could really play the game.

With the consoles, everyone has the exact same hardware, and the software is tailored to that hardware and can be made “close to the metal” to wring out every drop of performance. Yet they can’t make it to 1080, while my ancient AMD quad-core and Radeon 6850, supposedly less powerful than the hardware in these consoles, laugh at 1080.

That’s just ridiculous.

Japan Moves Forward to Air 8K Broadcasts by 2018

posted on 2014-09-15 23:57 EDT

And to me, everything still looks the same as if it were all still 640x480. Maybe they are developing the wrong tech; maybe they should make tech that tells you that you need to clean your screen so you can see all the vibrant colors, instead of trying to make more of them strong enough to break through all the filth.

C’mon, it has to look at least slightly different than VGA (640*480) as Full HD is significantly wider.

It’s almost scary that Japan is going to start testing 8K in ~2 years and will begin testing 4K broadcasting in only a few months, while here in the US we still can’t broadcast higher than 1080p30 and Blu-ray can’t go higher than that either yet.

On a side note, this is why I lament that so many movies and shows are filmed digitally: they’re shot at most in 4K, generally in ~2K, and will have to be upscaled for our near-future ~4K TVs, whereas I’ve heard film is ~5K native, meaning that a film movie from 30-40 years ago could look better on a 4K TV than a modern TV show or movie.

I wonder what resolution the Star Wars prequel trilogy was shot in, since I believe that was shot 100% digitally?

Nope, with my eyes I could never see a full horizontal-width screen, thus why I prefer vertical screens and why I never go to theaters. Why pay for something I cannot fully use? Colors, depth, everything looks the same. No change in brightness, vibrancy, or that other nonsense people speak of to try to sell the new gimmick.

There is one difference: I can see less, since the comparably priced screen to replace my 43" diagonal is not as tall, so it is like I downgraded, and all because it had to fit the space sized for the width of the old 34" diagonal TV. I have gained parts on the sides I cannot see and lost parts vertically. I traded 34" for basically 24" of viewing. :dry:

Also, the HD signal means that TAN VOD, with its yellow subtitles, makes things harder to read because the colors are washed out and faded, and when pausing, like on a CRT, the subtitles are just completely blurred. That’s why I have been watching things on CR: the tech behind viewing in a Flash player doesn’t cause the subtitles to blur when you pause a program to read them, AND they are white, which contrasts well with EVERY program unless it is set in the arctic, which very few ever are. On top of that, the text still has ye olde cartoon hard black outlines to separate it from the program, so it ALWAYS contrasts with the program itself, which means even LESS need to pause, since it isn’t yellow, which is the hardest color to read text in because it emulates sunlight, which our eyes have learned not to look at directly.

The aspect ratio isn’t what matters as much as the resolution (number of pixels).

16:9 is HD, right? 1080p?
4:3 is standard, which equates to 16:12, which means standard TV has a better ratio; it just had too few pixels, i.e. 640x480.

Things look better on my VGA monitors at 1024x768 (4:3) than on either a CRT TV or an HD TV because there is just more room, or better management of pixels.
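
To put rough numbers behind that (nothing fancy, just raw pixel counts and simplified ratios):

```python
from math import gcd

# Total pixel count and simplified aspect ratio for a few common resolutions.
resolutions = {
    "VGA": (640, 480),
    "XGA monitor": (1024, 768),
    "Full HD": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    d = gcd(w, h)
    print(f"{name}: {w}x{h} = {w * h:,} pixels, aspect {w // d}:{h // d}")
```

1024x768 has over two and a half times the pixels of 640x480 at the same 4:3 shape, and 1080p has nearly seven times as many, which is what really carries the “looks better.”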

Guess they didn’t realize that colorblind people would not benefit from this, or they did and didn’t care, because they could just charge for it anyway since all the other suckers would jump on the gimmicks like a car salesman selling someone a Hornet or Corvair.

Personally, I like the wider format for media viewing as, IMO, too many modern works zoom the camera in too close for 4:3, especially in action sequences. I can’t recall if they started doing this prior to 16:9 becoming standard or not, but when I watch “classic” TV programs (e.g. MeTV) I notice that they back the camera up for shots instead of being too close.

Technically you don’t need a 16:9 TV, as TV programs (excepting sports) are shot with all the important stuff inside the 4:3 “square,” but that doesn’t stop what I’ll term the claustrophobia that watching on 4:3 causes. 16:9 also has the advantage of reducing the letterboxing when watching movies.
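
To put a rough number on the letterboxing point (a back-of-the-envelope comparison, assuming a 2.39:1 “scope” movie scaled to fill the screen’s width, and ignoring overscan):

```python
# How much of the screen height a wide movie actually uses when letterboxed,
# assuming the picture is scaled to the full screen width.
MOVIE_ASPECT = 2.39  # typical widescreen theatrical ratio (my assumption)

for name, screen_aspect in [("4:3 TV", 4 / 3), ("16:9 TV", 16 / 9)]:
    used = screen_aspect / MOVIE_ASPECT  # picture height as a fraction of screen height
    print(f"{name}: picture fills ~{used:.0%} of the height, ~{1 - used:.0%} is black bars")
```

Roughly 74% of a 16:9 screen’s height gets picture versus about 56% on 4:3, so the bars shrink from nearly half the screen to about a quarter.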

My old 1080 monitor had weak contrast, weird brightness and poor off-angle viewing. It also looked “smeared”. I discovered that this was due to cheap LCD production (TN display) and the new LCD that I have, based on IPS technology, eliminated those issues. My 1080 HD TV is of higher quality than my older 1080 TN monitor but the TV, also being TN, has similar visual issues (albeit to a lesser extent).

It could also be said that 4:3 is 12:9.

Are these computers at Funimation still running Windows XP?

A lot of companies continue to use XP. Usually, it’s because they use some kind of custom software that only works with XP, so upgrading isn’t an option.

Not sure why Funimation would still be using it though. :huh:

IF they are, I will take 3 of them!

In the full video of the Funimation tour it looks like the old screensaver from stock XP but it is too blurry to see it clearly. Sadly it was one of the few non-Mac computers shown :frowning:

Personally, I think that the Surface is underpowered (only 4 GB of RAM in 2014?!), but I like that M$ is marketing it (exclusively) to the artistic types, as maybe it’ll break the Mac stranglehold on that market.

An XP-era computer could be brought to its knees by even a script-heavy website. Just too slow for the modern world, and largely single-core (shudder)!

For starters, it isn’t hard to block all those stupid scripts, and not every computer needs to connect to the internet. Standalone machines are for productivity, not social media.

Again, it dumbfounds me how people think every electronic device needs to be connected to some network before it can do anything. :huh: