Ceres, Vesta, NASA, Mars, Jupiter and far far away shit

Between Mars and Jupiter there is an asteroid belt, the remnants of a former planet.
The biggest chunks in the asteroid belt are Vesta and Ceres.

The bigger of the two is Ceres. And it being blue and brown certainly makes it more interesting than watching just a rock. So you would think NASA has explored Ceres and has high-resolution images. Mainly because of the blue.

But then reality shows you that NASA has only this shit from Ceres (using Hubble):

And this glorious, much better one from Vesta (greyscale at last, not those ugly color images from Hubble):

Also, because I know people like low-quality greyscale much better, I will convert the ugly color image of Ceres into a glorious low-quality greyscale image so it can be better appreciated, as was said to me in another thread:

Type “Hubble Moon” into Google and you will see:

  1. Greyscale images of the Moon
  2. Color images of the Moon
  3. Modified color images of the Moon

And decide which you like more.

The orbit of the Moon.

Imagine you have a sphere orbiting around a center. It is free to make every type of movement.
Now imagine the sphere is “egg shaped”. Nature makes eggs that shape to keep them from escaping: if you place an egg on a table and give it a push, instead of rolling off the table it will circle around a center. One of the “sides” of this egg will always face the center of its orbit.
Well, this is exactly what happens with the Moon. But instead of having the geometrical shape of an egg, it has a gravitational egg shape. The Moon has zones of extremely high gravity (MASCONs = mass concentrations), and they are almost all on the side we see. So its movement is not what science tells you, about “a great coincidence where the Moon rotates exactly once for every orbit”. That is absolute bullshit. What happens is that the gravitationally egg-shaped Moon, just like an egg on a table, shows only one side while orbiting. The other side of the egg is never seen. The Moon has most of its mass on one side. Why does this bullshit keep being fed to people if they at NASA really know this?

I’m pretty sure we only see one side of the Moon from Earth because the Moon itself is tidally locked, meaning its rotation period exactly matches its orbital period.

Also, I don’t buy the idea that most of the Moon’s gravity would be clumped onto just one side (so you believe an astronaut would just jump off into space on the other side, then?). The entire concept of gravity revolves around it being a force that pulls objects toward the center of mass of a volume, which in all of the known cases in the Solar System lies at the core of the celestial body. Also, it doesn’t make sense that the Moon would be a lot more massive on the side we see, because there is a lot more in the way of mountains, ridges, and valleys on the far side. Now, it is true that data from new satellites like GRACE has indeed confirmed that gravity on Earth is uneven, but the difference is far too small to be felt by man or animal.

I know you will probably disagree with this (because the so-called ‘alternative theory’ is the only thing you believe in anyway), but I can’t just let this lie go unchallenged.

The Hubble Telescope
In 1997 or so the Hubble was placed in orbit. What cameras does it have, with what resolution, and covering what nanometers?
First, remember the visible spectrum in nanometers:
from roughly 700 nm (red) to roughly 400 nm (violet) is what we would call a color camera’s range.
Hubble has these cameras:

NIR: sees from 850 nm to 1700 nm, with a resolution of 1024x1024.
UVIS: from 200 to 1000 nm (so this one sees in our dear color), with a resolution of 4096x4096 (this in 1997; I hadn’t even dreamed of a 16-megapixel camera at that time, and Hubble has one).

NUV: from 170 to 320 nm (beyond violet, great for seeing UFOs for example). Resolution of 1024x1024.
FUV: from 115 to 205 nm, on an amazing 32768 x 1024 sensor (32 megapixels!!! in 1997).
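For anyone wondering where the “16 megapixel” and “32 megapixel” labels come from, here is the arithmetic in a short Python sketch, using the sensor dimensions listed above:

```python
# Megapixel figures for the sensor dimensions quoted in this post.
def megapixels(width: int, height: int) -> float:
    """Return sensor resolution in megapixels (1 MP = 1,000,000 pixels)."""
    return width * height / 1_000_000

sensors = {
    "NIR":  (1024, 1024),    # ~1 MP
    "UVIS": (4096, 4096),    # ~16.8 MP ("16 megapixel" in round numbers)
    "NUV":  (1024, 1024),    # ~1 MP
    "FUV":  (32768, 1024),   # ~33.6 MP ("32 megapixel" in round numbers)
}

for name, (w, h) in sensors.items():
    print(f"{name}: {w}x{h} = {megapixels(w, h):.1f} MP")
```

So the round “16” and “32” figures are slight understatements of 16.8 MP and 33.6 MP.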

Well, that is obsolete technology from the ’90s.
Now, in 2012, we are using a 1-megapixel camera on a rover on Mars. And building a webpage explaining how absolutely great it is, debunking the conspiranoids who point out that it is absolute BULLSHIT!!!

Because of the MASCONs. Google for a computer image of them by our dear disinfo agency NASA. It is all about the “egg thing”, but gravitational.

No, I didn’t say that. I say that because one side has more gravity (because it has more mass, because the MASCONs are there), it is equivalent to the egg-shaped effect you see with an egg on a table. If you push it, you see it goes orbiting around a center on the table, and one side of the egg always faces the center of that orbit. The same happens with the Moon, but the “egg shape” is the gravitational shape of the Moon. It has more gravity on the side facing us, so it was rotating freely at first, but probably after only a few thousand years, or perhaps less, it became locked as we see it now.

It is not massive. Only enough to keep one side always facing Earth. So it is a massive effect, but not massive in quantity. NASA scientists say that if an astronaut stood at the edge of a MASCON holding a bar, it would lean by half a degree or a degree; I don’t remember the exact number. I think you will find this if you read about the MASCONs on the Moon.

The strange case of the Jupiter NASA photos.

Do you like seeing images of planets? I love them. And I love them in color. I understand that the majority of people I discuss with love greyscale. But I am one of those rare people who just love color. That is the way my faulty brain is: it is wired with color hunger. I also recognize that people just love blurred, low-quality images. But NASA has a 32-megapixel camera, a 16-megapixel camera, and the ability to zoom, because that is what “telescope” means. Then, for God’s sake: zoom and use the 4096x4096 color camera!
And that is exactly what NASA does. They put up a webpage where they post the amazing images of the planets. So, for example, I would love to have the biggest image of Jupiter that NASA’s Hubble is able to capture. I just want a composite, or whatever the biggest image NASA has (using Hubble only; I don’t want Cassini images or anything else, we are talking in this post only about Hubble images).
The Hubble Solar System page is this one:

And the biggest Jupiter image I was able to find was this one:

Disappointing, because I was expecting to see the whole planet, and in more quality. It seems they have a problem, because showing only this “frame” means they are not able to capture several shots and then composite them into a glorious bigger image that shows Jupiter in astonishing quality, completely full. It seems they just fired off one shot and, with luck, caught a bit of the planet. Sorry guys, if you want more of the planet we will need to zoom out to get it complete; capturing several images and compositing them is not something we can do. That seems to be the message. So we are looking at Jupiter at the most zoom they can manage. That is the best quality they have when looking at the absolutely biggest and nearest Jupiter.
Strange, because NASA then shows glorious images of very distant galaxies with more detail than this one of Jupiter. Also because we have other shots of Jupiter, like this one:

where much more detail is seen.
So, that is why I titled this post the strange case of the Jupiter NASA shootings. Because you never know: if they have more quality, why don’t they show us Jupiter in the biggest quality they have? Imagine looking at Jupiter in a 16000x16000 photo, for example! I would love that! Well, they just don’t want to do it.

For example this one: it seems they really have a lot of zoom, eh? Why not aim that zoom at Jupiter, make a composite, and post the best Jupiter panorama ever?

Go to the Jupiter image, place this image there at the size it has, and you will know why NASA is just bullshit and lies. Why are we fed low quality for the near planets and high quality for the far things?
Point Hubble at every planet and make astonishing panoramas. And I want the Moon first.

So we are going to compare the best high-quality image of Mars from Hubble, the same for Jupiter, and the same for some far far away shit they put on the Hubble page:

We can see the zoom used can’t be the same, because the far far away shit appears almost the same size as Jupiter.


As I posted, the red spot on Jupiter has much higher quality in these images:

So I wonder why they are using all the zoom they have on far far away shit, but always much less quality on Mars, the Moon, Jupiter, etc. It seems the most interesting things are given the least zoom. Are they perhaps paying per unit of zoom used? So if you use more zoom you have to pay more or something? Because if not, it is just not understandable that you use low quality for Mars and Jupiter. I focus on quality for Jupiter because once you know how many pixels they can give you in an image of Jupiter, then it is simple math to calculate the quality Mars must have. And that is my goal: to have Mars in more quality. That is the important thing. And then comes Ceres, and Pandora’s box opens fully.

Bao2, what kind of weird shit goes on in your head!!

And we already know the resolution of the Hubble’s cameras. How its glorious color camera is 16 megapixels, how another is 32 megapixels, and how they have others with 1 megapixel and so on.

Well, which of the four cameras would you use to take images of the Moon, and what resolution would you use? We are going to see!
Imagine, for example, you want to shoot the Copernicus crater. NASA points the Hubble and takes this composite:

The curious thing about this image is that it is pure greyscale. The three values for the R, G and B components are exactly the same. This only happens if you use greyscale. Shooting the Moon in color will not give you this. They forced the camera to work in greyscale, or they converted from color to greyscale before giving the taxpayer this image. They don’t like to give Moon shots in color. I wonder why.
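The greyscale test described above is easy to reproduce yourself. A minimal Python sketch, assuming you have already read the pixels out of the posted image as a list of (R, G, B) tuples (the `pixels` values below are only illustrative):

```python
# An image is "pure greyscale" when every pixel has identical R, G and B values.
def is_pure_greyscale(pixels) -> bool:
    """pixels: iterable of (R, G, B) tuples."""
    return all(r == g == b for r, g, b in pixels)

# Example: two grey pixels vs. one slightly tinted pixel.
print(is_pure_greyscale([(120, 120, 120), (80, 80, 80)]))  # True
print(is_pure_greyscale([(120, 121, 120)]))                # False
```

In practice you would read the real pixel data with an image library; the check itself is just this comparison.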

(If you doubt that it is Hubble, just look at the page of that photo:

Well, you know what a composite of images is: the camera shoots one image, then another and another, and then you open GIMP and join all of them. Looking at the composite we can tell whether the sensor was a rectangle (the 32-megapixel sensor) or a square (the 16-megapixel color sensor). Look at my analysis:

In the first image I put a red square where I believe the first shot was taken. Then I move that red square to the place where the second shot was taken, then the third and fourth. The fourth shot already touches the Copernicus crater, and the artificial-intelligence software clearly detects that and immediately makes a 2x zoom: you can see that if the 2x zoom had not been done, the fifth shot would have been taken in the red square, but that image was not taken. Instead the green one was taken, and then the AI software moved on to take the others with the new green square around the Copernicus crater.
It is obviously AI software, because if it were done manually the green square would not be placed so far away from the subject, nor would the shooting have started where the first shot was taken either. It seems an automated task.

Now, the sensor used is square, so my guess is they are using the 4k x 4k = 16-megapixel color camera. The first red square would span 4k pixels, and the green one would also span 4k pixels. To make the composite you need to resample the red-square shots 2x so they span 8k pixels each, so your 4k green one can match in the composite.

Well, if you measure how many pixels the NASA image has, it has 1626 pixels for the red square side and 813 for the green square (a pixel above, a pixel below; I am not being too precise here). That means they are not using the 1-megapixel camera but the 4k x 4k = 16-megapixel camera. But instead of giving us 4k pixels on the green square, they are giving us 0.8k pixels, exactly 5 times less resolution than the camera gives them.
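The “5 times less” figure above is just this division (4096 px is the UVIS sensor width from the camera list earlier; 813 px is the measured green-square side):

```python
# Ratio of native sensor width to the width published in the composite.
native = 4096      # UVIS sensor width in pixels
published = 813    # measured side of the green square in the posted image

factor = native / published
print(f"published at 1/{factor:.2f} of the sensor resolution")  # ~1/5
```

So "5 times less" is a rounding of a factor of about 5.04 per side.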

And that is a demonstration that NASA doesn’t like high-quality images, or a demonstration that NASA fools you every time they post an image.

For that image the page says “You are attempting to access an image with an extremely high resolution. While the file size may be small, the number of pixels this image contains requires at least 10.48 MB of free RAM that is not being used by any other application, including your operating system.”

So they warn you that a high-resolution image is going to occupy 10 megs of RAM and your computer is going to explode.
Give me a break.
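For what it’s worth, a figure like “10.48 MB” is simply width × height × bytes per pixel for the uncompressed image. A quick sketch; the 2200x1588 dimensions below are only an illustration that happens to land on 10.48 MB, not numbers taken from the NASA page:

```python
# RAM needed to hold an uncompressed image in memory.
def ram_mb(width: int, height: int, bytes_per_pixel: int = 3) -> float:
    """Footprint in MB for an uncompressed RGB image (3 bytes per pixel)."""
    return width * height * bytes_per_pixel / 1_000_000

print(f"{ram_mb(2200, 1588):.2f} MB")  # 10.48 MB for a 2200x1588 RGB image
```

So the “extremely high resolution” they warn about is roughly a 3.5-megapixel RGB image: a fifth of what the 16-megapixel sensor captures.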

I hope for confirmation that they have the original with 5x more quality, and that some day, when the deception ends, they post it for me to see Copernicus as God or some rock created it.

Richard, this thread is about camera resolutions and whether NASA is using the full zoom.
Probably of interest to someone. All the newbies have problems understanding resolutions. For example, you know those guys who take a 4-gig image to print a 1-meter x 2-meter poster…
I hope this post clarifies resolutions and such.

And obviously I would love not to see the Matrix angry, and to be able some day to post about the Apollo movies. Great for animators; they will understand frame rates and such.

Mars and Hubble are not friends.
Basically the problem is that the AI in Hubble doesn’t understand Mars colors. They try and try, and each time a photo is taken the colors are different. I will post the extremes; the in-between images you can see by clicking on the little thumbnails to the right in these links.
Extremely reddish (so much that even the clouds are yellow; probably Hubble was thinking of sulfuric clouds or something…):

Heavenly attractive: the clouds seem to have the right color, and the atmosphere looks the same as what amateur astronomers around the world are posting. Perhaps the right one:

I like NASA’s decision to use the color camera for Mars images. I would not agree with them going greyscale-only, as they do with the Moon. But even when they do, it still shows good clouds and atmosphere, certainly:

A shot of Jupiter’s moon Io makes you wonder why that resolution is not used to make an incredible composite image of Jupiter, and then of course of my dear Mars.

It seems NASA only uses big zooms for faraway or little things. When the things are big, lower resolutions are used. They are thinking of users who don’t have 10 megs of RAM, basically. ESA does the same, so it’s fine.

Perhaps you wonder why the 32-megapixel camera works in ultraviolet?
Because ultraviolet is where the Universe works. For example, the interesting video with the tether and the “ice chunks”, as NASA explains them, uses ultraviolet.

This is Saturn in ultraviolet. I would love to see Earth in ultraviolet. But after the tether incident, it seems the ultraviolet thing is closed, at least for Earth.


Once you know the size of the sensor Hubble has, you look at those images they post:

And this one is 500 pixels in height. That means it is 8x smaller than the original image from the color sensor. 8x smaller per side means 64 times fewer pixels: you see one pixel out of every 64 the original image has. Why? Is that what people want? I really would like to see Saturn in full glory. Imagine, if this little chunk were a 16-megapixel image, what an incredible Saturn they could post, and don’t post.
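The 8x / 64x arithmetic above, spelled out (4096 px is the color-sensor side from the camera list; 500 px is the posted image height):

```python
# Linear vs. areal downscaling: shrinking each side by ~8x
# keeps only about 1 pixel out of every 8^2 = 64.
sensor_side = 4096   # UVIS sensor side in pixels
posted_side = 500    # height of the posted Saturn image

linear = sensor_side / posted_side   # ~8.2x smaller per side
area = linear ** 2                   # ~67x fewer pixels overall
print(f"linear: {linear:.1f}x, pixels kept: 1 in {area:.0f}")
```

The exact numbers are 8.2x per side and about 1 pixel in 67; “8x and 64x” is the rounded version.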

That image is the “highest quality image” as they say, and from this page:
you can download lower quality and do the greyscale conversion if you desire…

Just to be accurate:
In post 5 I said the 16-megapixel and 32-megapixel sensors were there in 1997. That is not correct. As the link in post 5 shows, they were added to the Hubble in Servicing Mission 4, which happened in 2009. So we have had 16 megapixels and 32 megapixels only from 2009 to the current day. Before that, it had less resolution.

The rest of what I posted continues to be true: NASA has a lot more resolution than they release to the people. They actually resample the images, as we have seen, to 5x and 8x lower than the original.

Bao2: You do realize that you are able to edit your posts, don’t you? I find it a bit annoying that you make 6 posts in your own thread within 1.5 hours.

What was the latest Guinness record?

Art Bell comes back. He was obliged to stop, or else… The Matrix has changed and now he is back.
One of his guests is Richard Hoagland

and this is a “former” consultant of NASA saying stupidities to make everyone look stupid. Better that I post about him than have someone point it out some day. You can’t be a “former” military worker. When you work for a military/intelligence agency, you do it for life. If you want to blow up claims, what better than to send out and promote a guy who says that the ripples in sand are ancient canals? This “make it so ridiculous that everyone will feel ashamed” is a known trick to blow up the “enemies”. It only fails when the “enemies” point out who is who.

If Richard sees “structures” in the Moon’s sky, perhaps he is looking at the “Hollywood” set (the Australian one).

Ahh, I see now: you listen to Art Bell. That explains a lot. :slight_smile: You should try listening to the Alex Jones show too; he has a lot of the same kind of views as Art. But my personal favorite is just walking up to a random homeless person and listening to them talk for an hour or so. I get tons of info from them! :smiley: