### Is D an Underrated Programming Language?

All totally true. I'd still like it to be there -- or some other way of enforcing the existence of a virtual destructor baked into the language rather than hand-rolled -- but they're not going to introduce it.

### Is D an Underrated Programming Language?

At least these days every compiler supports #pragma once, which cuts the noise of the guards down to a single line. Not ideal - I agree we shouldn't need this shit anymore - but it's a lot less annoying than it used to be. It's still non-standard, of course, but so far as I know gcc, clang and MSVC all support it. I don't use any other C++ compilers so I can't speak for anything else.
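For comparison, this is the difference in a header (a sketch; the guard macro name and function are arbitrary, not from any real project):

```cpp
// Traditional include guard: portable, but three lines of noise
// plus a macro name you have to keep unique across the project.
#ifndef MY_WIDGET_H
#define MY_WIDGET_H

void Frobnicate();

#endif // MY_WIDGET_H
```

```cpp
// The non-standard but widely supported alternative: one line,
// no macro to invent, no risk of a copy-pasted guard collision.
#pragma once

void Frobnicate();
```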

One thing we could very much do with in C++ -- an interface class. C++ abstract base classes are fine, but they come with one drawback, which is that there's no standardised way of enforcing a virtual destructor. I'm fine with putting in a virtual destructor in every base class but again, it's just noise. There's no need for it now. I should be able to declare

```cpp
interface Interface
{
    void Method();
};
```

and have that exactly equivalent to

```cpp
class Interface
{
public:
    virtual void Method() = 0;
    virtual ~Interface() {}
};
```

and have

```cpp
interface Interface
{
    void Method();

protected:
    ~Interface() {}
};
```

equivalent to

```cpp
class Interface
{
public:
    virtual void Method() = 0;

protected:
    ~Interface() {}
};
```

Not difficult to do, but it's not in the standard. It would cut down noise for those of us who make extensive use of interfaces and reduce the risk of accidental memory leaks quite significantly.

### How Galaxies Are Disappearing From Our Universe

It's impossible to say for certain at the minute since we can't prove anything, but my guess would be both -- we don't understand what can generate a metric on cosmological scales (in the sense of how it's composed of billions upon billions of local metrics that are best modelled by Schwarzschild or Kerr-Newman), but if we did understand how to do so, the result would most likely, by necessity, still be a phenomenological description. The ideal would be that we'd end up with a situation similar to that of thermodynamics, which is an emergent theory and a phenomenological description, based ultimately, via statistical mechanics, on small-scale physics. Without that, we'll be where thermodynamics was in the 19th century: a phenomenological description with no convincing way of demonstrating its validity.

### How Galaxies Are Disappearing From Our Universe

A bit of both -- an objection to FLRW itself, and to dust solutions. On a fundamental level, the FLRW solution can never be more than an approximation to the real universe until we find a way of mapping the small-scale physics up to a universal scale. Since no well-defined averaging procedure exists (regardless of whether we're doing 3d averaging, 4d averaging, or some kind of statistical averaging) that does this with any rigour and generality, we can't actually ever state that the universe is FLRW "on average".

On a more practical level, I don't think anyone would seriously question the applicability of FLRW+perturbations in the radiation-dominated universe, or indeed in the matter-dominated universe up to a pretty late redshift (arbitrarily, somewhere between z=5 and z=1). I certainly wouldn't, myself, and have done quite a bit of work on perturbation theory in the early (ie pre-CMB) universe without feeling the slightest twinge. Where it gets a bit more dubious is when those perturbations actually begin to grow: in particular, when the linear perturbations grow through 1 (ie when delta = delta rho/rho >= 1), at which point the perturbative expansion is very definitely long since dead, and also when the *second-order* perturbations grow through the first order, at which point the expansion is also basically dead. This happens in the relatively recent, and therefore dust-dominated, universe.

One reason averaging became so in vogue is that if we could find an effective dark energy, we'd solve both the cosmological constant problem *and* the coincidence problem in one go -- the dark energy is dominant now exactly because we live when there are massive inhomogeneities in the universe.

Of course, it hasn't worked, but that doesn't mean it isn't still an issue - in particular that we simply do not know how to write this theory down, so all we're working with is (extremely successful) phenomenology.

FLRW is unarguably a subcase of LTB; I'm not sure anyone would disagree with that. (LTB is also a subcase of Szekeres, while Minkowski space is a subcase of FLRW with constant scale factor. Minkowski is also a subcase of Schwarzschild with a vanishing mass, etc. There are reams of inhomogeneous solutions that are related in some pretty convoluted ways; "Inhomogeneous Cosmological Models" by Krasinski attempts to at least catalogue them. It's not the most readable of books, but he did try and make it comprehensive.)

### How Galaxies Are Disappearing From Our Universe

I may not have explained it very clearly. The point is that while the Earth is obviously in a "special place" locally -- no other planet in the entire universe has exactly the same conditions around it, in a relatively sparse arm relatively distant from the centre of a relatively large spiral galaxy in a relatively small galaxy cluster that's on the outer edge of a supercluster -- if you zoom out a bit and look at things on average, on scales of roughly a megaparsec and above (the approximate size of a galaxy cluster, something like 100 times larger than our galaxy), it all begins to look eerily similar. On larger scales (let's say around 50Mpc and upwards) it all turns into a similar-looking mush of little bubbles where everything is basically indistinguishable from everything else. Attempting to pin it down properly, this "homogeneity scale" appears to be somewhere between 75 and 250Mpc or so.

That's the point -- the Earth isn't in a special place in the universe, in that where we are isn't marked out as anything special. In an average sense, picking a random spot in the universe will lead to a view indistinguishable from the one we have from the Earth -- if you ignore local eccentricities such as stars, voids and mighty blasts of raw radiation from supermassive black holes in galactic centres. That is, the assumption is that from anywhere in the universe, if you ignore everything within perhaps a kiloparsec or so, it's all going to look very much of a muchness. In particular, the CMB is going to look basically the same: a featureless wash of radiation at a constant temperature, with little ripples of about one part in 100,000.

### How Galaxies Are Disappearing From Our Universe

Surprisingly, no. There are suggestions of an anisotropy in the Hubble rate in different halves of the sky, but the errors are too big for this to be significant. That's the main problem with doing anything of the sort -- the error bars on the observations are just too big until we get far enough away (as in, taking velocities from galaxies far enough away that there are loads of them) to beat them down by sheer power of numbers.

But what that could let us do is put a constraint on how far away a black hole of one form or another would have to be, since the level of anisotropy introduced would be related, in one way or another, to the distance. For instance, in Randall-Sundrum braneworld models it would depend on what is known as the "dilaton", which is basically the distance between the branes but manifests on the branes themselves as another scalar field. Any such constraint would obviously be model-dependent: the results for, say, a universe moving towards a hole in an RS-type model may or may not be very different from those in a more sophisticated model in some other approximation to M theory. About the only thing I'd say in general is that *any* directionality is going to induce an anisotropy. If the directionality is subtle enough that it's drowned in the noise of very local observations, then the influence of the whole is itself going to be correspondingly minor. How minor I obviously can't say, since I've not looked at it in anything more than speculation entirely unbacked by analysis, but I'd be stunned if it introduced more than relatively small errors.

That doesn't mean it wouldn't be an interesting scenario (if, of course, it hasn't already been examined -- as I said, I'd be surprised if someone hasn't already looked at it, at least in the context of RS models), and it's also not to say whether or not the corresponding impacts would be significant. But it would be a careful balancing act to keep something like a black brane far enough off to avoid anisotropies that aren't covered by the existing error bars and yet still produce significant impacts. Not to say it can't be done, just that it would take a bit of care.

### How Galaxies Are Disappearing From Our Universe

I can't actually imagine a setup that would lead to that in vanilla relativity, but even if we assume it could exist it would introduce strong anisotropies into the universe. The very nature of falling into a black hole introduces a directionality, which would immediately produce anisotropies that we don't observe. If the hole is on a rough order of magnitude in scale with the universe, this would be even more obvious since rather than just having a general directionality constant throughout the universe, we'd now have a directionality depending on space (and time), focused on the centre of the hole. This would leave some characteristic signatures on the CMB that aren't observed.

If you wanted instead to embed this in higher-dimensional theories - so the universe is, say, falling into a 7+1d hole - then frankly no-one can give a full answer, since it's not a setup amenable to full analysis (but then, neither is the one I've been discussing in the previous paragraph). I'd imagine it's possible to get a setup that doesn't introduce such anisotropies in the dimensions we're observing. I'm thinking of a setup where, for instance, we're on a 3+1d brane and falling along a 5th dimension into a hole of some higher dimensionality, which extends infinitely (or as near as is sufficient to kill any anisotropies) parallel to the 3+1d brane. You might even be able to get a toy setup along these lines using something like the Randall-Sundrum models that were all the rage 15 years back -- these are composed of two 3+1d branes suspended in a 4+1d spacetime, parallel to one another. If you make one brane entirely "black" then you'd have a setup with one brane on which a universe can live, separated along a 5th dimension from a black brane. I have genuinely no idea if anyone's looked at such a system, nor whether it can be realised in an RS model, but I wouldn't be surprised if someone's actually examined it. If not, it would certainly be interesting to look at.

### How Galaxies Are Disappearing From Our Universe

Yes, you are, and it's because you're not educated in the field.

"The article assumes that planet earth is the center of the universe"

No it doesn't. Cosmology does not assume that the Earth is in the centre of the universe. It assumes the exact opposite. It's even known as the "cosmological principle" -- and it's a fundamental axiom in cosmology. Without it we wouldn't have the model that we're talking about. Instead we'd have Lemaitre-Tolman-Bondi models, which are isotropic around the Earth but definitely not homogeneous.

Basically, building the cosmological model goes like this:

1) Observe the CMB. This is all around us, at 2.7K, and to about one part in 100,000 it is the same in every direction. It is, in the jargon, isotropic around the Earth.

2) Assume that gravity on large scales is accurately modelled by a geometric theory of gravity (such as, but not restricted to, general relativity). We now know that on average the universe should be described by a metric that is at least isotropic about a point near to Earth.

3) Since this is obviously absurd, as you've picked up on, apply the cosmological principle. If the Earth is not in a special position in the universe, which it would be an astonishing act of hubris to assume it is, but the universe looks isotropic around the Earth, then there are only two choices. We can either dump the cosmological principle and assume the universe is centred on Earth -- which is... untenable, given the vast scale of the observations -- or we can assume that the universe looks isotropic around every point. This implies that it is homogeneous and isotropic: every point is the same in every way.

4) We can now tighten our previous assumption and assume that the universe is modelled by a metric that is isotropic around every point. That means it is composed of what are known in the jargon as "maximally-symmetric" 3d surfaces. This leads us naturally and inevitably to the Friedmann-Lemaitre-Robertson-Walker metrics, which give rise to the "big bang" theory you dislike so strongly.
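For concreteness, the metric that step 4 lands on can be written in the usual notation, with a(t) the scale factor and k = +1, 0, -1 picking out the closed, flat or open case:

```latex
ds^2 = -c^2\,dt^2 + a^2(t)\left[\frac{dr^2}{1 - k r^2} + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right]
```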

There are obviously problems here. The phrase "on average" is used frequently and without rigour. That rigour cannot, as yet, be provided. We have assumed the nature of gravity twice - first that it is geometric in origin, and second that it is described by general relativity, which is basically the simplest geometric theory of gravity. Fitting to observation also leads us, naturally and inevitably -- unpleasantly so, if we're being honest -- to dark energy and dark matter.

But there is a need to "create these terms", in that the theory demands them, and the theory is *astonishingly successful*. One of the main successes of FLRW cosmology is that it first predicted a characteristic wavelength of ripples on the cosmic microwave background, which was then observed (and which can be used to determine how much dark matter there is relative to normal matter), and that that same wavelength should also be imprinted on the large-scale distribution of galaxies. This was *also* observed, and is exactly where it was predicted to be by combining CMB and supernova observations.

This is amazing not least because the theory predicts the CMB forming when the universe is around 300,000 years old, while the large-scale distribution of galaxies is observed when the universe is pushing on a bit, probably around 10bn-12bn years old. The wavelength in the galaxy distribution is therefore extremely stretched compared to that seen on the CMB. And, as one might expect, the level to which it is stretched is extraordinarily sensitive to the cosmology - it doesn't take much of a change in the levels of matter, dark matter and dark energy to put it slap bang in the wrong place entirely.

Doing this unfortunately means we need to put dark energy in the model. Unsurprisingly, this isn't as ad-hoc as it seems, since there are multiple candidates for a dark energy, but it's still a bit unfortunate since not many of them are profoundly appealing. (Perhaps the most appealing is also the original, proposed by Wetterich in 1987, since he derived it with reasonable motivation from relatively rational particle physics. Unfortunately it doesn't, quite, work.)

Thankfully research is ongoing amongst people who understand the need to "create a term", in a variety of directions, from challenging the fundamental assumptions that build up the model, to exploring the potential candidates for a dark energy.

### How Galaxies Are Disappearing From Our Universe

As a cosmologist, I can comment that the entire theory being talked about is based on a particular solution of GR -- which involves defining a particular metric living on a manifold and endowing it with dynamics via the Einstein field equations -- so I don't have much argument with what you said.

### How Galaxies Are Disappearing From Our Universe

He thinks it violates conservation of mass/energy. It doesn't, of course, but it's not entirely unreasonable to ask whether it does if you don't know where the theory actually comes from.

### How Galaxies Are Disappearing From Our Universe

"doesn't that violate some fairly fundamental laws of physics?"

Do you not think that one of the many thousands of theoretical and observational physicists who've worked on this model for decades would perhaps have spotted this flaw at some point in the last eighty years...? Of course it doesn't violate fundamental laws of physics. The whole thing is based tightly on general relativity, so regardless of whether you believe that relativity is being applied accurately to cosmology or not (I don't, not entirely) there is no suggestion of it violating any fairly fundamental laws. Conservation of mass/energy is absolutely guaranteed in relativity. (In two tightly-coupled ways - directly, and via the Bianchi identities which are nothing more than geometric identities along the lines of, but more complicated than, the Pythagoras theorem. Which one you take as more fundamental depends on your philosophy but in relativity the one implies the other.)

The balloon analogy is basically flawed. It relies on imagining (to the extent that one can, and no-one actually can, since our brains didn't evolve to imagine 4d let alone 5d) a 3+1d balloon embedded in a 4+1d spacetime, through analogy with a 2d balloon embedded in 3d space. This inevitably leads people to query, understandably, where the centre is, and to wonder if it's in the middle of this 4+1d space. It also leads people to ask, understandably, why the galaxies aren't expanding.

Basically, they're not expanding because the theory doesn't apply in them. There are two ways of viewing this - the simple (but inaccurate) and the headfuck. The simple way of looking at it is that the cosmological expansion is extremely weak and is very easily overpowered by other, more local, forces. So galaxies are easily held together because the gravitational pull between stars in a galaxy is overwhelmingly stronger than the pull of the cosmological expansion. This, unfortunately, does suggest there's some kind of balancing of forces and some kind of spatial expansion, which isn't strictly speaking true.

The headfuck is something that's actually almost impossible to model but straightforward to understand in relativity. The theory that the balloon analogy is based on is Friedmann-Lemaitre-Robertson-Walker (FLRW - we're probably missing a name or two in there, as well) cosmology, based on what's known as the FLRW metric, which does nothing more than give the Pythagoras theorem in a 3+1d universe made up of an inverted pyramid of flat 3d spatial surfaces stacked one on top of the other along some time direction. (They could also be a load of nested spheres, or more bewilderingly a pile of saddles, but the data supports the flat model and there's currently no real reason to favour the so-called closed or open models.)

The FLRW metric applies on scales at which the universe seems to look the same in every direction and wherever you move to. In the jargon, it's "homogeneous and isotropic". Things like the SDSS surveys demonstrate how this can happen quite well -- take a look at http://www.a.phys.nagoya-u.ac.... which is the collection of data from the first SDSS survey (which ended about a decade back, I think; we're on SDSS-III or thereabouts now, but I like this figure). On small scales this is obviously really knotty and far from homogeneous, but if you zoom out and squint slightly (to give a form of smoothing) then everything looks the same. Doing this a bit more rigorously, which is notoriously model-dependent, puts the "homogeneity scale" somewhere in the order of 100Mpc, or about a hundred times larger than a typical galaxy cluster.

That's the scale at which the FLRW model applies -- and that's the scale at which every single consequence of it can be said to hold. Below that, nothing it says should be taken without a massive pinch of salt. This is particularly true in clusters, which are what is known as 'virialised' and detached from the cosmological expansion -- they're better described by models such as the Lemaitre-Tolman-Bondi or Szekeres metrics.
On smaller, galactic scales, we'd be much better describing galaxies as some horrifically over-complicated cylindrical metric with a whacking lump in the middle. And no matter how you spin it, that thing isn't going to show an expansion.

A caveat to this is that basically any metric can be patched along its edges to an FLRW. So we get things like the vogue for LTB universes in the last decade, given that we can get the effects of dark energy without any actual dark energy if we tune the lump in the middle enough. (This doesn't work, for various reasons. Basically, you can fit the supernova data, but you don't half fuck up the CMB. And the baryon acoustic oscillations. But it's an interesting toy model that shows the kind of impact inhomogeneities can actually have without any exotic matter whatsoever.) Or you can patch a Schwarzschild, modelling a single star, to an FLRW and see what would happen to planetary orbits in some mythical universe composed of a single star in expanding spacetime. (They don't like it much.) But if we were able to build an accurate model, we'd actually be patching a foamy structure of LTB and Szekeres metrics together in such a way that they "average out" -- in one way or another that we also can't define -- to FLRW. Because FLRW itself does not exist: the universe merely behaves on large scales as if it were FLRW.

### Complex Life May Be Possible In Only 10% of All Galaxies

....

OK, I think this part of this thread is done. If you ever fancy reading that post, and any others I've made on the topic, feel free, and you might see just how little I'm talking "religion", and what a self-important jackass you sound.

### Complex Life May Be Possible In Only 10% of All Galaxies

OK, I think we've probably run into the same ambiguity that I mentioned in a different comment in this thread. There are basically two definitions that we might have of this "infinite" business. I'm working from the viewpoint of general relativity -- or some other metric-based theory of gravity -- since I want to work with a theory capable of quantifying statements. Given this, I have (at least) two definitions of "infinite" here:

1) The model we are driven to employ, the Friedmann-Lemaitre-Robertson-Walker model (or some mildly inhomogeneous or mildly anisotropic generalisation of it), says that the universe extends spatially to an infinite extent in two of the three subcases and to a finite extent in the other, and our data is only good enough to state that none of these is preferred over the others (though, given how close to flat the universe measures, the flat case is *theoretically* preferred -- a theoretical, non-observational, bias).

2) The past light cone is finite-volumed, as it obviously is, pending some revolution in our understanding.

So far as I understand your point, you're stating that the past light cone is finite-volumed. If that's what you're saying, excellent, we're in agreement. The problem is you used the word "space", which I interpret to mean "space", and the spatial extent of the universe is basically untestable except by reference to its density: if the universe is at the critical density then it is flat (and infinite); if it is below the critical density then it is hyperboloidal (and infinite); and if it is above the critical density then it is spherical (and finite).
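In the standard notation, with H the Hubble rate, the critical density and the three cases are:

```latex
\rho_{\mathrm{crit}} = \frac{3H^2}{8\pi G}, \qquad \Omega \equiv \frac{\rho}{\rho_{\mathrm{crit}}}:
\quad
\begin{cases}
\Omega > 1 & \text{spherical, finite} \\
\Omega = 1 & \text{flat, infinite} \\
\Omega < 1 & \text{hyperboloidal, infinite}
\end{cases}
```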

If we're not talking about spatial extent but instead talking about the 4-volume of the past light cone then we agree.

(I might also add that in GR the entire manifold is basically set -- the "future" of an event (mapped out by the future light cone), the "present" (the regions connected to it by spacelike geodesics), and the "past" (mapped out by the past light cone). From this point of view, which is one that GR forces on us (although it may very well be in contradiction with quantum mechanics), we can't tell if the universe is finite or not temporally. We do know that the past light cone appears to be finite, but we cannot know what the future light cone is, since we do not have backwards-propagating photons to bring us information along it. The future light cone could be finite, which would imply that the universe recollapses in the future. The likelihood right now is that the future light cone is infinite. That would mean that if we're going to generalise what we mean by "infinite" to four dimensions, we still end up with an infinite 4-volume. However, that's ultimately supposition and extrapolation.

(It can also be commented that actually we don't know that the past light cone itself is finite, since the theory cannot be used to propagate the light cone to, let alone beyond, the singularity. This doesn't actually mean that the universe formed at the singularity; it means that our theory cannot be used to propagate light that near to it - or anything else, for that matter. It may very well be that in a quantum theory of cosmology, there is no singularity, and that we have a bouncing universe. In that event, the past light cone is *not* finite and may very well, in fact, be infinite.

(This part of it though is academic in many ways too since as I commented in another post, we can't see back anything like to the singularity anyway. Our only probe is light, and the universe goes inconveniently opaque at the CMB, which is effectively a photo of the universe when it was a bouncing baby of 300,000 years or so. We simply cannot see beyond this, other than indirectly, unless we manage to observe the gravitational wave background or, even less plausibly, the neutrino background.)

### Complex Life May Be Possible In Only 10% of All Galaxies

So I guess the short answer is no it isn't, but yes it is, and there's a bit more to it than that. In my time on Earth, during which I've studied history and physics and worked professionally at cosmology for quite some years, I've learned that "there's a bit more to it than that" is a valid answer to practically anything anyone has ever said. There are always more depths at which one can examine something...

### Complex Life May Be Possible In Only 10% of All Galaxies

Thank you for this comment, by the way. I think adding it up I've probably spent quite a lot of time trying to do my best to explain and discuss cosmology on here, so it's nice to know it's appreciated.

### Complex Life May Be Possible In Only 10% of All Galaxies

On a practical level, yes it is -- the microwave background stands in our way. As the universe expands it cools down (same as how if you pump up a tyre both the pump and the tyre get hot, but in reverse) -- which means that tracking it back, in the past it was seriously fucking hot. The universe is also, even at the present day, composed of more or less 75% hydrogen and 25% helium. Helium's ionisation energy is very high, and hydrogen's, at 13.6eV, is lower but still substantial. And if you start working it out, it turns out that if you look back to when the universe was very roughly 300,000 years old, you suddenly find a universe hot enough that enough photons were energetic enough to keep hydrogen ionised (let alone helium). That then immediately implies that any photon would propagate a short distance (and a very short distance - the universe was vastly smaller then than it is now) before it was absorbed by a hydrogen atom, which then spat out its electron. That electron would propagate a short distance before it fell into a proton and emitted another photon, which would almost immediately slam into another hydrogen atom, and so on.
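The cooling is just the redshift scaling of the radiation temperature; with T_0 roughly 2.7K today and the CMB released at a redshift of roughly z = 1100, the universe back then sat at about the temperature where hydrogen ionises:

```latex
T(z) = T_0\,(1+z) \quad\Rightarrow\quad T(z \simeq 1100) \approx 2.7\,\mathrm{K} \times 1100 \approx 3000\,\mathrm{K}
```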

That means that the universe was totally opaque. Using light, and we have no other probe right now (though a direct observation of a gravitational wave background would ease this somewhat, as would a brutally unfeasible detection of a cosmic neutrino background), we therefore have a hard limit back at the CMB.

Even if the CMB somehow wasn't there, yes, looking back we run eventually and inevitably into the beginning of the universe. The distinction is born entirely of the theory we're couching cosmology in -- general relativity or, at least, a geometric theory of gravity very similar to general relativity. In these theories the universe is actually described as a whacking great four-dimensional blob which we've sliced for convenience into spatial surfaces along some time direction. (Those choices aren't arbitrary; there are conditions on what you can choose as a time coordinate, and on what you can choose as a spatial coordinate.) It's those spatial surfaces that seem likely to be infinite in extent.

However, since we're working in something like GR we also have the restriction that we can't see outside of our past light cone. Nothing can propagate faster than the speed of light, and in GR that is actually encoded in the types of path things can propagate along: light propagates along "null" paths, normal matter along "timelike" paths, and either nothing or "tachyons" along "spacelike" paths. These characters are intrinsic and *cannot* overlap: a timelike path will never be a null path, and cannot cross over to become a spacelike path. That would be geometrically nonsensical. Spacetime is then mapped out by these null geodesics, and if you map them back into the past you get what's known as a light cone -- formed of all the light that could possibly have reached us. What we can possibly observe has to come from within, or on the boundary of, that light cone.
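In terms of the metric, the three characters of path are just the sign of the interval (here using the (-,+,+,+) signature convention; the opposite convention flips the signs):

```latex
ds^2 < 0 \ \text{(timelike: massive matter)}, \qquad
ds^2 = 0 \ \text{(null: light)}, \qquad
ds^2 > 0 \ \text{(spacelike)}
```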

Now, that light cone *is* finite, and if we extend it all the way back to the singularity then we'll obviously run into problems. At a singularity everything genuinely dies, and our theory doesn't even begin to work. Even geometry doesn't. All this tells us is that our dynamical theory is missing something (which many - including myself - believe is a quantum theory of gravity that smooths out that singularity, quite possibly by enforcing a maximum density within a minimum volume, or some similar process). If we decide we're not going to wilfully put a load of infinities in our denominators, and instead cut our light cone off shortly after this singularity -- let's say when the universe was less than a femtosecond old; I'm sure that's young enough to satisfy everyone -- then we can even calculate the 4-volume of that light cone. And it is finite.

I think this might be one of the causes of the confusion on this point.

I'd also like to apologise again if I came across as a bit curt.

### Complex Life May Be Possible In Only 10% of All Galaxies

No, I'm talking about the spatial extent of the universe. I'm not sure how you're defining "universe" but it doesn't appear to coincide with the definitions used by cosmologists.

I'm also talking from a position that can be backed up by a large amount of both theory and data. The data cannot show that the universe is infinitely extended, but it very definitely does not say that it *isn't* infinitely extended, and the theory actually favours infinitely extended over finite. You're free to disagree, but frankly, unless you can disagree backed up by a theoretical model that fits the background cosmological evolution and, more importantly, the perturbations on the CMB, the shape of the matter power spectrum, the *oscillations* on the matter power spectrum, *and* the dimming of distant Type Ia supernovae, you're on a hiding to nothing. You have to fit this data.

I don't like Lambda CDM. Very few people who've examined the fundamentals of cosmology do. It's wrong. It has to be wrong. It's wrong on principle, and it can be proven to be wrong extremely quickly, to those who have been trained. It goes like this: cosmology is a theory built on averages, but those averages are implicit. No-one has a clear idea of how to take an average in general relativity (or any geometric theory of gravity) and end up with another covariant theory. The averages we do have rely on spacetime being globally hyperbolic. This means that there are no geodesic crossings, and this condition is necessary because every current way of averaging tensor fields, including Zalaletdinov's, little though he'd want to admit it, involves taking your tensor field, casting it along null geodesics to the centre of your averaging domain, taking the average, and then casting it back out again. If we have geodesic crossings, then that process is not one-to-one -- the average is ambiguous and is therefore not covariant. Since cosmology is (implicitly, and ham-fistedly) based on averaging a locally-covariant theory, phrased as a globally-covariant theory, cosmology is dead in the water.

The problem is that Lambda CDM works extraordinarily well. Anything that replaces it *has* to fit all the data that Lambda CDM does, at least as well as Lambda CDM, and with fewer of its problems. And that doesn't exist. There is nothing on the horizon that can do that, and I don't see much hope for something to come up. Cosmologists who do examine the fundamentals are aware of this and are trying to find exactly why LCDM emerges as such an astonishingly successful phenomenology - it may well be a thermodynamic theory, in many ways - while everyone else just gets on with it, knowing that it works. Because it does, amazingly well.

And Robertson-Walker cosmology, and the Lambda CDM cosmology that is based on it, do not agree with the strength of your statement. The current universe may be finite. True. Our past light cone is finite. True. Our future light cone is finite. Probably not true. The universe has a finite spatial extent. No-one has the slightest clue.
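For what it's worth, the "past light cone is finite" point is easy to check numerically: in a flat Lambda CDM toy model (all numbers below are illustrative round values, not fitted parameters), the comoving particle horizon converges to a finite distance.

```python
# Sketch: the comoving radius of our past light cone (the particle horizon)
# is finite in a flat matter + Lambda model. H0 and the density parameters
# are illustrative round numbers; radiation is neglected for simplicity.
import math

H0_KM_S_MPC = 70.0            # Hubble constant (illustrative)
OMEGA_M, OMEGA_L = 0.3, 0.7   # matter and Lambda density parameters
C_KM_S = 299792.458           # speed of light

def hubble(a):
    """H(a) in km/s/Mpc for a flat matter + Lambda universe."""
    return H0_KM_S_MPC * math.sqrt(OMEGA_M / a**3 + OMEGA_L)

def particle_horizon_mpc(steps=200000, a_min=1e-8):
    """Comoving distance c * integral_0^1 da / (a^2 H(a)), midpoint rule."""
    da = (1.0 - a_min) / steps
    total = 0.0
    for i in range(steps):
        a = a_min + (i + 0.5) * da
        total += da / (a * a * hubble(a))
    return C_KM_S * total

d = particle_horizon_mpc()
print(f"particle horizon: {d:.0f} Mpc (comoving) -- finite")
```

The integral converges (the integrand only grows like a^(-1/2) at early times), which is the whole point: however long you integrate, the light cone encloses a finite comoving volume.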

top
### Complex Life May Be Possible In Only 10% of All Galaxies

Ah, but here you're falling into one of the common misconceptions about cosmology. The universe isn't expanding *into* anything -- if it were, the universe would have a centre. While observationally we (obviously) can't prove that the universe doesn't have a centre, it's a fundamental principle that if it did, we almost certainly aren't at it. This is dubbed the Copernican principle and is one of the key tenets of cosmology. You can build cosmological models that violate the Copernican principle but they all leave you feeling a bit soiled - and even then you end up with cosmologies that typically (but not always) are composed of spatial surfaces of infinite extent.

The key to this is that if we're going to build a model of cosmology we have to employ a theory of gravity. The only serious theories we have are all "metric-based", in that they treat gravity as geometry. (There are very firm reasons for doing so, including the observation that gravity is almost certainly a fictitious force, given that it imparts the same acceleration to all objects regardless of mass. To do so implies either that the gravitational and inertial masses must be identical (up to a choice of units) or else that the force is fictitious, on the same basis as the likes of centrifugal and Coriolis forces -- very real in the reference frame they're measured in, and zero in a different frame.) In any metric-based theory, if you impose that the universe is isotropic, as it is to a high degree if one observes the microwave background, and then additionally impose that the Earth is not at the exact centre, you're led basically to Robertson-Walker models. In these models, the universe is composed of a "foliation" of sheets; basically, a lot of three-dimensional surfaces stacked one on the other and filling the whole of the 4-d spacetime. Two of the three RW models are composed of *infinitely extended* spatial slices. Only one of them is composed of finite surfaces. Observation can't distinguish between them, but there is no more reason to assume we live in a finite universe than an infinite one -- and indeed there's a mild hint towards the opposite, in that something so close to flat is most likely flat (and if it isn't could well be open - both are infinitely extended). It's true that this is a theoretical bias, but when observation fails we don't have much to fall back on, and observation leaves the distinct probability that the universe is spatially extended to infinity.
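For concreteness, the Robertson-Walker line element described above can be written (in one common convention; signatures and coordinates vary by author) as

```latex
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2}
      + r^2\left(d\theta^2 + \sin^2\theta\,d\varphi^2\right)\right],
\qquad
k = \begin{cases}
+1 & \text{closed (finite spatial slices)}\\
\phantom{+}0 & \text{flat (infinite spatial slices)}\\
-1 & \text{open (infinite spatial slices)}
\end{cases}
```

All the expansion sits in the scale factor $a(t)$; the sign of $k$ is what decides finite versus infinite slices, and it's that sign the data can't currently pin down.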

It's all a bit academic, of course, since we can never observe, let alone travel to, that infinite extent - we can only see back within our own light cone, and that *is* of finite volume, both in space and in spacetime.

top
### Complex Life May Be Possible In Only 10% of All Galaxies

Fancy adding a bit of weight to the random abuse? I don't know where Kjella got the idea that as far as we know there's a finite amount of energy around, for one thing. While there may be a finite amount within our horizon, that's a very different thing, since all we need to do is move a few megaparsecs and we've got a slightly different horizon. The statement that there is a finite number of galaxies within our horizon is completely uncontroversial (and indeed obvious, not least since our horizon extends back before the formation of galaxies at all...), while the statement that there is a finite number of galaxies *outside* our horizon is unsubstantiated and unlikely to be substantiated from that starting point.

On the other hand, gameboyhippo's statement that space is not infinite because it is expanding is also very much arguable. In this case we're on firmer ground, since we can look at the models actually employed in cosmology. If we restrict ourselves to the (Friedmann-Lemaître-)Robertson-Walker models, which are far and away the most widely used, then we've got three of them. One of them is formed of a three-dimensional spherical surface (so not a sphere you'd recognise, but the same in 4d), which is evolving. This is a "closed" universe, and in this model, indeed, space is not infinite but is expanding. But there are two other models. In one of them, the universe is composed of a foliation of saddle-shapes (in 4d). This is an "open" universe, and here space is infinite and expanding. In the other, the universe is composed of flat sheets. This is (unsurprisingly enough) the "flat" universe, and is *also* infinite and expanding.
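A quick sketch of the volume point, with an illustrative scale factor: only the closed slice gives a finite total, while the flat (and open) slice volumes grow without bound as you take in more comoving radius.

```python
# Sketch: spatial volume of the RW slice geometries at fixed time.
# Only the closed (k=+1) slice, a 3-sphere, has finite total volume,
# V = 2*pi^2*a^3; the flat slice volume diverges with comoving radius.
import math

def closed_volume(a):
    """Total volume of a 3-sphere of radius a (closed RW spatial slice)."""
    return 2.0 * math.pi**2 * a**3

def flat_volume(a, chi):
    """Flat-slice volume out to comoving radius chi: unbounded as chi grows."""
    return (4.0 / 3.0) * math.pi * (a * chi)**3

a = 1.0  # scale factor in arbitrary units (illustrative)
print(f"closed slice, total volume: {closed_volume(a):.3f}")
for chi in (1.0, 10.0, 100.0):
    print(f"flat slice out to chi={chi:>5}: {flat_volume(a, chi):.3f}")
```

Expansion just rescales a(t) in both cases; it says nothing about whether the slices themselves are finite.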

It can certainly be argued that the data currently prefers a closed universe, but it does so at a statistically meaningless level (the data covers both other cases, well within one standard deviation); the data cannot currently tell us anything. At this point theoretical bias comes in and we have to ask ourselves whether there's any good reason the universe would be almost, but not exactly, flat. The answer is sure, we can come up with reasons, but justifying the actual numbers involved is an a posteriori exercise. Instead, the preference is for the simpler model, flatness, until the data improves.
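To illustrate what "statistically meaningless" means here, take a hypothetical curvature measurement (the numbers below are made up for illustration, not any real survey's values): a one-sigma interval that straddles zero is consistent with all three geometries at once.

```python
# Sketch: a hypothetical curvature density measurement Omega_k = 0.001
# +/- 0.002 (illustrative numbers only). In the usual convention,
# Omega_k > 0 is open, Omega_k < 0 is closed, Omega_k = 0 is flat.
omega_k, sigma = 0.001, 0.002  # hypothetical mean and 1-sigma error

def geometry(ok):
    """Classify the RW geometry from the sign of Omega_k."""
    if ok > 0:
        return "open (infinite)"
    if ok < 0:
        return "closed (finite)"
    return "flat (infinite)"

lo, hi = omega_k - sigma, omega_k + sigma
print(f"1-sigma interval: [{lo:+.3f}, {hi:+.3f}]")
print("consistent with:", geometry(lo), "/", geometry(0.0), "/", geometry(hi))
```

The interval contains negative, zero, and positive values, so a "preference" for one sign of the curvature at that level carries no statistical weight.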

Do you see why I might have stated that both statements are so arguable that they're close to meaningless? I could have phrased it better, and I apologise for sounding brusque, but I stand by it.

top
### Complex Life May Be Possible In Only 10% of All Galaxies

Yeah well I'm not going to go into a protracted lecture on parameter estimation here. The point is that the data is consistent with the universe being flat, and the theoretical bias is towards the universe being flat rather than a tiny bit away from it. These taken together explain why so many cosmologists - myself included - simplify the calculations by making the universe exactly flat. Sure, we might be wrong in doing so, because the theoretical bias is just that, but in most contexts the error is genuinely tiny given just how flat it seems to be.
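To put a rough number on "genuinely tiny": compare the comoving distance to z = 1 in an exactly flat model against one where a small slice of the dark energy budget is swapped for curvature. All parameters are illustrative round values, not fitted ones.

```python
# Sketch: the error from assuming exact flatness. Compare comoving
# distance to z = 1 for Omega_k = 0 versus a tiny Omega_k = 0.001
# (illustrative parameters; radiation neglected).
import math

C_KM_S, H0 = 299792.458, 70.0  # speed of light, Hubble constant (km/s/Mpc)

def comoving_distance_mpc(z_max, omega_m, omega_k, omega_l, steps=100000):
    """c * integral_0^z dz' / H(z'), midpoint rule."""
    dz = z_max / steps
    total = 0.0
    for i in range(steps):
        z = (i + 0.5) * dz
        e = math.sqrt(omega_m * (1 + z)**3 + omega_k * (1 + z)**2 + omega_l)
        total += dz / (H0 * e)
    return C_KM_S * total

flat = comoving_distance_mpc(1.0, 0.3, 0.0, 0.7)
curved = comoving_distance_mpc(1.0, 0.3, 0.001, 0.699)
frac = abs(flat - curved) / flat
print(f"flat: {flat:.1f} Mpc, curved: {curved:.1f} Mpc")
print(f"fractional difference: {frac:.2e}")
```

With curvature at that level the distances agree to a small fraction of a percent, which is why setting it to zero is a harmless simplification in most contexts.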