Is the average global temperature a good way to measure global warming?



  1.

    Yes, but it’s not perfect. The Intergovernmental Panel on Climate Change measures global warming “based on an increasing trend in global average temperature over time” (1). Scientists fit a linear trend to the global average temperature record to estimate how quickly it is rising. However, instrumental temperature records only go back about 100 to 150 years, so comparisons with earlier periods rely on less precise proxy data. Also, not everyone attributes the rise in global average temperature to man-made global warming.

  2.

    It’s one of them, but there are flaws. It’s difficult to attribute temperatures taken in different areas to climate change, because other factors, such as the urban heat island effect and seasonal variability, may be present. It can be used to better understand the climate, but it’s only a small piece of a larger picture. Other ways to measure climate change include tracking atmospheric CO2 levels and the melting of the ice caps.

  3.

    It is, yes, though with the caveats described above. It also depends on whether one is examining average sea or average land temperatures. Global average sea temperatures are considered a better indicator than land-based measurements because the oceans are less prone to anomalous temperature fluctuations. The warming oceans paint a bleak picture of climate change: according to NOAA, 2010 tied with 2005 as the third warmest year on record. NOAA also recently compiled a study examining ocean temperature fluctuations over the last two decades; it found that the rate of warming currently documented in the world’s oceans is both faster than previously expected and inconsistent with expected natural fluctuations.
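The linear-trend approach mentioned in the first answer can be sketched in a few lines of Python. The anomaly series below is synthetic and purely illustrative (not real NOAA or IPCC data); the point is only to show how a warming rate is estimated by least-squares fitting:

```python
import numpy as np

# Synthetic annual temperature anomalies in degrees C (illustrative only).
# A made-up underlying trend of 0.018 C/year plus random year-to-year noise.
years = np.arange(1960, 2021)
rng = np.random.default_rng(0)
anomalies = 0.018 * (years - 1960) - 0.1 + rng.normal(0.0, 0.1, years.size)

# Fit a straight line by least squares; the slope is the warming rate.
slope, intercept = np.polyfit(years, anomalies, 1)
print(f"Estimated warming rate: {slope * 10:.2f} C per decade")
```

With real data the same fit would be applied to an observed anomaly record; the caveat from the answer above still holds, since a short or noisy record makes the fitted slope less reliable.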
