## Abstract

Energy is conserved by the first law of thermodynamics; its quality, however, constantly degrades through entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation, both regarding ways to reduce its magnitude and regarding whether its limit as time tends to infinity is bounded. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also establishes the relation between the second law of thermodynamics and mathematical inequalities by developing inequalities of either new or classical nature. A concise review of entropy is also included to support the analysis in this work and similar analyses of other processes in the future.

## 1. Introduction

It is well known that the classical theory of heat conduction is based on the Fourier law of heat conduction and the first law of thermodynamics, with the aim of predicting and controlling the rate at which heat is conducted [1]. The former is the constitutive relation of heat flux that correlates the temperature gradient (the driving force and cause of the heat-conduction process) with the heat transfer rate (the outcome and effect of the process) [1,2]. The latter is the conservation relation of energy, one of the most fundamental laws of nature, stating that energy can be neither created nor destroyed [1–4]. The classical and general approach to studying heat-conduction processes consists mainly of two steps: (i) obtaining temperature fields, either from heat-conduction equations or from experimental measurements, and (ii) obtaining the heat transfer rate and the way to control it via the Fourier law of heat conduction. The classical heat-conduction equation comes from applying the first law of thermodynamics to heat conduction with the Fourier law of heat conduction as the constitutive relation for the heat flux [1]. The results of such a classical study consist mostly of the temperature field, the heat transfer rate and the ways to control them.

We take a step beyond the classical theory of heat conduction by applying the second law of thermodynamics to heat conduction, a typical irreversible process that satisfies both the first and the second laws of thermodynamics. There are various expressions that illustrate the nature of the second law of thermodynamics [2–7]; each is typically convenient to apply to some particular class of processes. For heat conduction, we use the following expression, known as the increase of entropy principle [3]: *the entropy generation S*_{gen} *of a system during a process always increases or, in the limiting case of a reversible process, remains constant*, i.e. d*S*_{gen}/d*t* ≥ 0 with *t* being the time. This requires that *S*_{gen}(*t*_{2}) ≥ *S*_{gen}(*t*_{1}) for all *t*_{2} ≥ *t*_{1}. Applying d*S*_{gen}/d*t* ≥ 0 and *S*_{gen}(*t*_{2}) ≥ *S*_{gen}(*t*_{1}) can then yield classical or new mathematical inequalities and uncover solution features of heat-conduction equations.

It is well known that the classical macroscopic definition of entropy offers little regarding its physical interpretation or meaning [3,8–20]. Indeed, our understanding and appreciation of entropy rely mainly on using it in commonly encountered processes and systems. To address the macroscopic definition and physical meaning of entropy, both the first and second laws of thermodynamics were applied in [20] to show that the entropy *S* of a system is proportional, with an arbitrary positive constant *C*, to the heat exchanged between the environment and the system as the system undergoes a totally reversible process [20]. The constant *C* is introduced to recover the classical definition. However, it is more convenient and desirable, both for illustrating the physical meaning of entropy and for performing entropy analysis, to take *C* to be 1 (unitless, so that *S* has an energy unit). The entropy *S* at any state is thus the unavailable portion of the system energy at that state, the part that cannot be converted into work. This definition reduces to the classical one by choosing the constant *C* as the inverse of the environment temperature (so that entropy has the classical unit of J/K) [20].

Energy is conserved by the first law of thermodynamics [3]; entropy represents the part of the system energy that cannot be transformed into useful work [20], so any entropy generation degrades the quality of energy [20–22]. It is thus important to examine d*S*_{gen}/d*t* regarding ways to reduce its magnitude, and to examine whether *S*_{gen} remains bounded as time tends to infinity: provided *S*_{gen} is kept well controlled, the future of humankind will be positive if *S*_{gen} is bounded for all processes. We attempt to address these important questions with the heat-conduction process, a typical example of an irreversible process. This differs fundamentally from other studies of second-law analysis [23–28], which mainly aim at improving, upgrading or optimizing the performance of practical processes.

## 2. Entropy and entropy generation

In this section, we present a concise review of entropy that is critical for performing entropy analysis of any process, including heat conduction. The review covers the Clausius inequality, the classical definition of entropy, the calculation of entropy and entropy changes, and the entropy balance for closed systems, which involve no mass exchange with their surroundings.

### (a) Entropy

Consider any material system; the second law of thermodynamics can be used to derive, for any cycle [3],

$$\oint \left(\frac{\delta q}{T}\right)_{b} \le 0,$$

where *δq* is the heat amount transferred per unit mass of material inside the system, *T* is the Kelvin temperature and the subscript b stands for the system boundary. This equation is called the Clausius inequality [3]. The equality holds for reversible cycles and the inequality for irreversible cycles. For a reversible cycle, the Clausius inequality can be rewritten as

$$\oint \left(\frac{\delta q}{T}\right)_{\mathrm{rev}} = 0.$$

Applying this to the reversible cycle 1-A-2-B-1 of figure 1*a*, we have that $\int_1^2 (\delta q/T)_{\mathrm{rev}}$ is the same along paths A and B, i.e. independent of the path. This path independence allows the definition of a state property, the specific entropy *s*, by

$$\mathrm{d}s = \left(\frac{\delta q}{T}\right)_{\mathrm{rev}}.$$

After assigning a reference state at which *s* = 0, we can obtain the entropy at any state *x*, in principle, by

$$s_x = \int_{\mathrm{ref}}^{x} \left(\frac{\delta q}{T}\right)_{\mathrm{rev}};$$

the saturated liquid at 0.01°C is normally used as the reference state for water [3]. To find *s _{x}*, we must know *δq*/*T* on the system boundary during a reversible process from the reference state to the state *x*, which is normally not available. The practical way to find entropy or the entropy change between any two states is via thermodynamic relations that come from the first and second laws of thermodynamics.
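The sign convention of the Clausius inequality can be made concrete with a two-reservoir heat engine. The sketch below (the temperatures and heat amounts are illustrative numbers of our own, not from the paper) evaluates the cyclic sum of *Q*/*T* for a reversible (Carnot) engine, where it vanishes, and for an irreversible engine rejecting extra heat, where it is negative.

```python
# Hedged sketch: cyclic sum of Q/T for a two-reservoir heat engine.
def clausius_sum(q_in, q_out, t_hot, t_cold):
    """Cyclic integral of delta-Q/T: heat q_in absorbed at t_hot,
    heat q_out rejected at t_cold (J and K, illustrative values)."""
    return q_in / t_hot - q_out / t_cold

t_hot, t_cold = 600.0, 300.0
q_in = 1000.0

# Reversible (Carnot) engine: q_out/q_in = t_cold/t_hot -> sum is zero.
q_out_rev = q_in * t_cold / t_hot
rev = clausius_sum(q_in, q_out_rev, t_hot, t_cold)

# Irreversible engine: less work produced, more heat rejected -> sum < 0.
q_out_irr = q_out_rev + 100.0
irr = clausius_sum(q_in, q_out_irr, t_hot, t_cold)

print(rev, irr)
```

The equality case recovers the reversible limit of the Clausius inequality; any extra rejected heat drives the cyclic sum strictly negative.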

Entropy is a state property. It can be expressed in terms of more familiar properties such as temperature *T*, pressure *p*, specific volume *v* and specific heats *C _{v}* and *C _{p}* through the two d*s* relations [3]. These relations come from the analysis of a closed system subject to a reversible process that, in general, does moving-boundary work and has heat added. Note that d*s* does not depend on the process; we can select any process to find it.

The first law of thermodynamics reads, for the closed system in differential form on a per unit mass basis,

$$\delta q = \mathrm{d}u + \delta w,$$

where *δw* is the work amount and d*u* is the change in internal energy. For a reversible process that does boundary work and has heat added, the second law of thermodynamics leads to *δq* = *T*d*s* and *δw* = *p*d*v*, so that

$$T\,\mathrm{d}s = \mathrm{d}u + p\,\mathrm{d}v, \qquad (2.13)$$

the first d*s*, or Gibbs, equation [3]. By applying the definition of enthalpy, *h* = *u* + *pv*, equation (2.13) becomes

$$T\,\mathrm{d}s = \mathrm{d}h - v\,\mathrm{d}p,$$

the second d*s* equation [3]. While the two d*s* equations are obtained by applying both the first and second laws of thermodynamics to a reversible process, they are valid for both reversible and irreversible processes, because entropy is a thermodynamic state property and its change between two states is independent of the type of process. Equations (2.15) and (2.19) are relations between the properties of a unit mass of material as it undergoes a change of thermodynamic state, and they are applicable whether the change occurs in a closed or an open system [3]. They also yield the same d*s*; the first and the second d*s* equations work conveniently for known *C _{v}*, d*T* and d*v*, and for known *C _{p}*, d*T* and d*p*, respectively.
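That the two d*s* equations yield the same entropy change can be checked numerically for an ideal gas, for which they integrate to *C _{v}* ln(*T*_{2}/*T*_{1}) + *R* ln(*v*_{2}/*v*_{1}) and *C _{p}* ln(*T*_{2}/*T*_{1}) − *R* ln(*p*_{2}/*p*_{1}). The sketch below assumes air-like constant specific heats; the states are illustrative choices.

```python
import math

# Sketch (assumed air-like constants): the integrated forms of the two
# ds relations give the same entropy change between two ideal-gas states.
R = 287.0            # J/(kg K), gas constant of air (assumption)
cv = 718.0           # J/(kg K), constant specific heat (assumption)
cp = cv + R

T1, p1 = 300.0, 1.0e5
T2, p2 = 450.0, 3.0e5
v1, v2 = R * T1 / p1, R * T2 / p2   # ideal-gas specific volumes

ds_first = cv * math.log(T2 / T1) + R * math.log(v2 / v1)   # from T ds = du + p dv
ds_second = cp * math.log(T2 / T1) - R * math.log(p2 / p1)  # from T ds = dh - v dp

print(ds_first, ds_second)
```

The agreement reflects exactly the path independence of entropy stated in the text.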

Incompressible substance is a good approximation for solids and liquids. For incompressible substances, *v* is constant and *C _{v}* = *C _{v}*(*T*). We have, thus, from the first d*s* equation,

$$\mathrm{d}s = C_v(T)\,\frac{\mathrm{d}T}{T}.$$

Integrating d*s* from State 1 to State 2 yields

$$s_2 - s_1 = \int_{T_1}^{T_2} C_v(T)\,\frac{\mathrm{d}T}{T}.$$

When the temperature difference, |*T*_{2} − *T*_{1}|, is sufficiently small, *C _{v}* is effectively constant, so that

$$s_2 - s_1 = C_v \ln\frac{T_2}{T_1}.$$

If the state with *T* = 1 K is selected to be the reference state, the entropy at any state *x* is

$$s_x = C_v \ln T_x. \qquad (2.23)$$
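The constant-specific-heat result can be checked against a direct quadrature of d*s* = *C*d*T*/*T*. The sketch below uses illustrative, water-like values of our own choosing.

```python
import math

# Sketch (illustrative water-like values): entropy change of an
# incompressible substance with constant specific heat,
# s2 - s1 = C ln(T2/T1), checked by numerically integrating C/T.
C = 4180.0               # J/(kg K), assumed constant specific heat
T1, T2 = 300.0, 350.0    # K

closed_form = C * math.log(T2 / T1)

# trapezoidal rule for the integral of C/T from T1 to T2
n = 100_000
h = (T2 - T1) / n
numeric = 0.5 * h * (C / T1 + C / T2) \
    + h * sum(C / (T1 + i * h) for i in range(1, n))

print(closed_form, numeric)
```

The two values agree to well below 1 μJ/(kg K), confirming the logarithmic form used throughout the entropy analysis below.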

### (b) Entropy balance

Consider a cycle made of an irreversible process 1-I-2 and a reversible process 2-R-1 in figure 1*b*; the Clausius inequality yields

$$\int_1^2 \left(\frac{\delta q}{T}\right)_{b} + \int_2^1 \left(\frac{\delta q}{T}\right)_{\mathrm{rev}} = -s_{\mathrm{gen}},$$

where *s*_{gen} ≥ 0 is the entropy generation during process I. By using the definition of entropy, $\int_2^1 (\delta q/T)_{\mathrm{rev}} = s_1 - s_2$, we arrive at the entropy balance equation

$$s_2 - s_1 = \int_1^2 \left(\frac{\delta q}{T}\right)_{b} + s_{\mathrm{gen}}. \qquad (2.25)$$
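A concrete illustration of the entropy balance: when two identical incompressible blocks at different temperatures equilibrate by heat conduction alone, with no heat exchanged with the surroundings, the balance reduces to *s*_{gen} = *s*_{2} − *s*_{1}. The sketch below uses illustrative masses, specific heat and temperatures of our own choosing.

```python
import math

# Sketch (illustrative values): entropy generated when two identical
# incompressible blocks equilibrate adiabatically, so that
# S_gen = S_final - S_initial with no entropy transfer term.
m, C = 1.0, 900.0        # kg and J/(kg K), assumed block properties
Ta, Tb = 500.0, 300.0    # initial temperatures, K
Tf = 0.5 * (Ta + Tb)     # final common temperature for equal masses

# Sum of the two blocks' entropy changes, each m*C*ln(Tf/T_initial)
s_gen = m * C * (math.log(Tf / Ta) + math.log(Tf / Tb))
print(s_gen)
```

The result is strictly positive, as the second law requires for this irreversible equilibration; it would vanish only in the trivial case *T _{a}* = *T _{b}*.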

## 3. Temperature field

Consider heat conduction in a solid bar of length *l* with constant material properties and specified temperature gradients at both ends, the second-kind or Neumann boundary condition [1]. As the contribution of a non-homogeneous boundary condition to the temperature field can be represented by source and initial terms (see the appendix for the proof), we can focus our attention on the following initial-boundary value problem with homogeneous boundary conditions without loss of generality:

$$\frac{\partial T}{\partial t} = a^{2}\frac{\partial^{2} T}{\partial x^{2}} + f(x,t), \quad 0 < x < l,\; t > 0,$$

$$\left.\frac{\partial T}{\partial x}\right|_{x=0} = \left.\frac{\partial T}{\partial x}\right|_{x=l} = 0, \qquad T(x,0) = \varphi(x),$$

where *x*, *t* and *T* are position, time and temperature, respectively; *a*^{2} is the thermal diffusivity; *f*(*x*, *t*) is the rate of heat generation inside the bar per unit volume and per unit specific capacity of the material. The heat generation may be due to nuclear, electrical, chemical, gamma-ray or other sources and may be a function of time and/or position. *φ*(*x*) is the initial temperature distribution of the bar. By linear superposition, the solution is [1]

$$T(x,t) = T_{\varphi}(x,t) + T_{f}(x,t),$$

where *T _{φ}*(*x*, *t*) is the contribution of *φ*(*x*) that satisfies

$$\frac{\partial T_{\varphi}}{\partial t} = a^{2}\frac{\partial^{2} T_{\varphi}}{\partial x^{2}}, \qquad \left.\frac{\partial T_{\varphi}}{\partial x}\right|_{x=0} = \left.\frac{\partial T_{\varphi}}{\partial x}\right|_{x=l} = 0, \qquad T_{\varphi}(x,0) = \varphi(x),$$

and *T _{f}*(*x*, *t*) comes from the contribution of *f*(*x*, *t*) that satisfies

$$\frac{\partial T_{f}}{\partial t} = a^{2}\frac{\partial^{2} T_{f}}{\partial x^{2}} + f(x,t), \qquad \left.\frac{\partial T_{f}}{\partial x}\right|_{x=0} = \left.\frac{\partial T_{f}}{\partial x}\right|_{x=l} = 0, \qquad T_{f}(x,0) = 0.$$
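The decomposition of the temperature field into the contributions of *φ*(*x*) and *f*(*x*, *t*) can be checked numerically. The sketch below uses an explicit finite-difference scheme with the *φ* and *f* of figure 2; the grid and time-step sizes are illustrative choices of our own, picked to satisfy the explicit stability limit.

```python
import math

# Sketch: explicit finite differences for the Neumann heat-conduction
# problem, verifying the superposition T = T_phi + T_f by linearity.
def solve(l, a2, nx, dt, steps, phi, f):
    dx = l / nx
    T = [phi(i * dx) for i in range(nx + 1)]
    for k in range(steps):
        t = k * dt
        Tn = T[:]
        for i in range(1, nx):
            Tn[i] = T[i] + a2 * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1]) \
                    + dt * f(i * dx, t)
        Tn[0], Tn[nx] = Tn[1], Tn[nx-1]   # homogeneous Neumann ends
        T = Tn
    return T

l, a2 = 1.0, 1.0
nx, dt, steps = 50, 1e-4, 200             # a2*dt/dx^2 = 0.25, stable
phi = lambda x: 5.0 + math.cos(math.pi * x / l)
f = lambda x, t: math.exp(2.0 - t)

T_phi = solve(l, a2, nx, dt, steps, phi, lambda x, t: 0.0)
T_f = solve(l, a2, nx, dt, steps, lambda x: 0.0, f)
T_all = solve(l, a2, nx, dt, steps, phi, f)

err = max(abs(T_all[i] - (T_phi[i] + T_f[i])) for i in range(nx + 1))
print(err)   # superposition error at rounding level
```

Because the problem is linear, the full solution and the sum of the two partial solutions differ only by floating-point rounding.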

## 4. Entropy analysis and mathematical inequalities

### (a) Entropy evolution and generation

For the three cases of *f*(*x*,*t*) = 0, *φ*(*x*) ≠ 0; *f*(*x*,*t*) ≠ 0, *φ*(*x*) = 0; and *f*(*x*,*t*) ≠ 0, *φ*(*x*) ≠ 0, the total entropy in the bar is, respectively, by applying equation (2.23),

$$S_{\varphi}(t) = \rho A C_{v}\int_{0}^{l}\ln T_{\varphi}(x,t)\,\mathrm{d}x, \quad S_{f}(t) = \rho A C_{v}\int_{0}^{l}\ln T_{f}(x,t)\,\mathrm{d}x, \quad S_{\varphi f}(t) = \rho A C_{v}\int_{0}^{l}\ln T_{\varphi f}(x,t)\,\mathrm{d}x,$$

where *A* stands for the cross-sectional area of the bar, and *T _{φ}* and *T _{f}* are available in equations (3.4) and (3.10), respectively. Also, by superposition, *T _{φf}* = *T _{φ}* + *T _{f}*.

The total entropy generation in the bar up to time instant *t* follows, by the entropy balance of equation (2.25), for the three cases of *f*(*x*,*t*) = 0, *φ*(*x*) ≠ 0; *f*(*x*,*t*) ≠ 0, *φ*(*x*) = 0; and *f*(*x*,*t*) ≠ 0, *φ*(*x*) ≠ 0, respectively. Therefore, d*S*_{gen}/d*t* is also available for each of the three cases.

The evolution of *S _{φ}*, *S _{f}*, *S _{φf}*, d*S _{φ}*/d*t*, d*S _{f}*/d*t* and d*S _{φf}*/d*t* is shown in figure 2 in dimensionless form with *φ*(*x*) = 5 + cos(*πx*/*l*) and *f*(*x*, *t*) = e^{2−*t*}, where the overbar stands for dimensionless entropy normalized by *ρAlC _{v}*, the thermal capacity of the bar. The most striking feature is that all of them tend to finite limiting values as time tends to infinity.
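For the source-free insulated bar, the entropy balance gives *S*_{gen}(*t*) = *S*(*t*) − *S*(0), so d*S*/d*t* ≥ 0 must hold. The sketch below verifies this monotone growth numerically, evaluating the bar entropy as the integral of ln *T* over the bar with *ρAC _{v}* set to 1 (dimensionless entropy); the grid parameters are illustrative choices of our own.

```python
import math

# Sketch: monotone entropy growth for the insulated, source-free bar.
# S(t) is the integral of ln T over the bar (rho*A*C_v set to 1),
# evaluated along an explicit finite-difference solution.
l, a2 = 1.0, 1.0
nx, dt, steps = 50, 1e-4, 500      # a2*dt/dx^2 = 0.25, stable
dx = l / nx
T = [5.0 + math.cos(math.pi * i * dx / l) for i in range(nx + 1)]

def entropy(T):
    # trapezoidal rule for the integral of ln T over the bar
    return dx * (0.5 * math.log(T[0]) + sum(math.log(v) for v in T[1:-1])
                 + 0.5 * math.log(T[-1]))

S = [entropy(T)]
for _ in range(steps):
    Tn = T[:]
    for i in range(1, nx):
        Tn[i] = T[i] + a2 * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    Tn[0], Tn[nx] = Tn[1], Tn[nx-1]   # insulated (homogeneous Neumann) ends
    T = Tn
    S.append(entropy(T))

print(S[0], S[-1])   # entropy grows toward the uniform-temperature limit
```

The computed entropy increases at every step and approaches a finite limit as the temperature field flattens, mirroring the bounded entropy generation reported in figure 2.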

_{v}### (b) Second law of thermodynamics and mathematical inequalities

The second law of thermodynamics requires d*S*_{gen}/d*t* ≥ 0 and *S*_{gen}(*t*_{2}) ≥ *S*_{gen}(*t*_{1}) for all *t*_{2} ≥ *t*_{1} in each of the three cases.

As these relations are valid for all *l*, the localization theorem [31] can then be applied to obtain their pointwise counterparts, mathematical inequalities of either new or classical nature.

## 5. Concluding remarks

By its very essence, entropy represents the part of the system energy that cannot be transformed into useful work, and it is always generated in all processes by the second law of thermodynamics. This work, examining the one-dimensional heat-conduction process, a typical and important irreversible process, shows a finite value of entropy generation as time tends to infinity, no matter whether the heat conduction is initiated by the initial temperature distribution, the heat source, the non-homogeneous boundary condition or their combination. Such an analysis is important: the task of humankind is to keep entropy generation well controlled, and the future of humankind depends on whether the entropy generation is bounded. The example analysed in this work has also yielded mathematical inequalities of new or classical nature, which not only establish the relation between mathematical inequalities and the second law of thermodynamics, but also offer an innovative way of studying differential equations. While one-dimensional heat conduction is selected in this work, the analysis promoted here applies equally to two-dimensional and three-dimensional heat conduction and to other processes. As the initial attempt at such an analysis, this work contains a concise review of entropy that is critical for performing entropy analysis of any process.

## Authors' contributions

All of the authors have provided substantial contributions to the conception and design of the model, interpretation of the results and writing the article. All authors have given their final approval of the version to be published.

## Competing interests

The authors of the paper have no competing interests.

## Funding

L.W. is grateful to the Research Grants Council of Hong Kong (GRF 17237316, 17211115, 17207914 and HKU717613E), the University of Hong Kong (URC 201511159108, 201411159074 and 201311159187), the Zhejiang Provincial, Hangzhou Municipal and Lin'an County Governments for their financial support. X.L. thanks the NSAF (U1430101) and the Natural Science Foundation of Shandong Province (ZR2012AM019) for their financial support.

## Appendix A

Consider heat conduction in a solid bar of length *l* with constant material properties. Both ends of the bar have specified temperature gradients, the second-kind or Neumann boundary condition [1]. The temperature field in the bar is the solution of the following initial-boundary value problem [1]:

$$\frac{\partial T}{\partial t} = a^{2}\frac{\partial^{2} T}{\partial x^{2}} + f(x,t), \quad 0 < x < l,\; t > 0,$$

$$\left.\frac{\partial T}{\partial x}\right|_{x=0} = \mu_{1}(t), \qquad \left.\frac{\partial T}{\partial x}\right|_{x=l} = \mu_{2}(t), \qquad T(x,0) = \varphi(x),$$
*x*, *t* and *T* are position, time and temperature, respectively. *a*^{2} is the thermal diffusivity, *f*(*x*,*t*) is the rate of heat generation inside the bar per unit volume and per unit specific capacity of the material. The heat generation may be due to nuclear, electrical, chemical, gamma-ray or other sources that may be a function of time and/or position. *μ*_{1}(*t*), *μ*_{2}(*t*) and *φ*(*x*) are specified temperature gradients at the two ends and initial temperature distribution of the bar.

Let, for example,

$$T(x,t) = V(x,t) + w(x,t), \qquad w(x,t) = \mu_{1}(t)\,x + \bigl(\mu_{2}(t) - \mu_{1}(t)\bigr)\frac{x^{2}}{2l}.$$

Since ∂*w*/∂*x* equals *μ*_{1}(*t*) at *x* = 0 and *μ*_{2}(*t*) at *x* = *l*, *V*(*x*, *t*) satisfies homogeneous Neumann boundary conditions, with the source term modified to *f* − ∂*w*/∂*t* + *a*^{2}∂^{2}*w*/∂*x*^{2} and the initial condition modified to *φ*(*x*) − *w*(*x*, 0). The contribution of *μ*_{1}(*t*) and *μ*_{2}(*t*) to the temperature field can thus be represented by source and initial terms. We can therefore focus our attention on the initial-boundary value problem with homogeneous boundary conditions without loss of generality.
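As a quick sanity check, the sketch below verifies that an auxiliary function of the standard form *w*(*x*, *t*) = *μ*_{1}(*t*)*x* + (*μ*_{2}(*t*) − *μ*_{1}(*t*))*x*^{2}/(2*l*) (an assumed, though conventional, choice) reproduces the specified end gradients, so that *V* = *T* − *w* carries homogeneous Neumann conditions; the boundary data *μ*_{1}, *μ*_{2} are example functions.

```python
# Sketch: the x-derivative of the assumed auxiliary function
#   w(x, t) = mu1(t)*x + (mu2(t) - mu1(t)) * x**2 / (2*l)
# matches the prescribed end gradients mu1(t) at x=0 and mu2(t) at x=l.
def w_x(x, t, l, mu1, mu2):
    """Partial derivative of w with respect to x."""
    return mu1(t) + (mu2(t) - mu1(t)) * x / l

l = 2.0
mu1 = lambda t: 3.0 + t       # example boundary gradients (assumed)
mu2 = lambda t: -1.0

ok = all(
    abs(w_x(0.0, t, l, mu1, mu2) - mu1(t)) < 1e-12
    and abs(w_x(l, t, l, mu1, mu2) - mu2(t)) < 1e-12
    for t in (0.0, 0.5, 4.0)
)
print(ok)
```

Any *w* with these end gradients works; the quadratic form is merely the simplest polynomial choice.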

- Received May 18, 2016.
- Accepted September 5, 2016.

- © 2016 The Author(s)

Published by the Royal Society. All rights reserved.