Meaning of Nature
What is Nature:
Nature refers to everything that has formed spontaneously on planet Earth.
In this sense, nature includes all living organisms that inhabit the planet (animals and plants), all material and mineral substances (water, earth, iron, etc.), and all the planet's own processes (meteorological phenomena, tectonic plate movement, etc.).
Ecology, as such, is the science that studies how this set of elements interacts, maintaining a harmonious balance governed by its own laws.