Nickel Alloy Background
Nickel is an ideal base for alloys resistant to aqueous corrosion, for the following reasons:
- There is a plentiful supply of nickel, at a reasonable price.
- It is inherently more resistant to corrosion than iron.
- It exhibits a ductile, face-centered cubic structure (known as the “gamma” phase, similar to the favored “austenite” in stainless steels) throughout its solid range.
- Beneficial elements, in particular chromium, copper, and molybdenum, are highly soluble in nickel (i.e. they can be added in significant quantities without causing the precipitation of second phases in the microstructure).
- As a result of this high ductility, nickel alloys are very amenable to wrought processing (hot and cold), fabrication, and welding.
Chromium, copper, molybdenum, and iron are among the elements added to nickel to enhance its corrosion resistance or (in the case of iron) to reduce costs.
The primary role of chromium in nickel-based alloys is to enable the formation of protective (“passive”), chromium-rich (oxide or hydroxide) surface films in corrosive solutions of an oxidizing nature. Such solutions induce cathodic reactions of high potential involving oxygen, whereas solutions of a reducing nature induce lower potential cathodic reactions involving hydrogen evolution.
Pure nitric acid solutions are oxidizing, as are many impure solutions of other acids. Impurities with strong oxidizing tendencies include ferric ions and dissolved oxygen. As with steels, which are only regarded as “stainless” when chromium contents exceed approximately 13 wt.%, the corrosion-resistant nickel alloys also require a threshold chromium content to enable passivation in oxidizing solutions. This is believed to be around 15 wt.%. More typical in the most versatile nickel alloys are contents ranging from 16 to 23 wt.%.
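The approximate chromium thresholds above can be captured in a small illustrative check. The function name, structure, and return convention are mine, not from any standard; the threshold values are simply those quoted in the text (~13 wt.% Cr for stainless steels, ~15 wt.% for corrosion-resistant nickel alloys).

```python
# Illustrative sketch (not a standard): classify whether an alloy's
# chromium content is likely sufficient for passivation in oxidizing
# solutions, using the approximate thresholds quoted in the text:
# ~13 wt.% Cr for iron-based (stainless) steels, ~15 wt.% Cr for
# corrosion-resistant nickel-based alloys.

def can_passivate(cr_wt_pct: float, base: str) -> bool:
    """Return True if the chromium content meets the approximate
    passivation threshold for the given base metal ("iron" or "nickel")."""
    thresholds = {"iron": 13.0, "nickel": 15.0}  # wt.% Cr, approximate
    return cr_wt_pct >= thresholds[base]

print(can_passivate(18.0, "nickel"))  # within the typical 16-23 wt.% range -> True
print(can_passivate(12.0, "iron"))    # below ~13 wt.% -> False
```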
Copper, which is fully soluble in nickel (i.e. all mixtures of the two elements exhibit a single FCC structure, in the absence of other alloying elements), enhances the resistance of nickel in seawater and reducing acids, especially hydrofluoric. It is used in small quantities in some of the chromium-bearing alloys of nickel, but is a major constituent of several corrosion-resistant nickel alloys (associated with the MONEL® trademark), with copper contents at around 30 wt.%.
Molybdenum ennobles nickel and therefore enhances its resistance to reducing acids, i.e. those that induce a cathodic reaction involving the release of hydrogen. Such acids include hydrochloric and sulfuric, the most commonly encountered industrial corrosives. Since atoms of molybdenum are relatively large, it also strengthens the gamma solid solution.
Tungsten (from the same group of elements, and with an even larger atomic size) is used as a partial substitute for molybdenum in some alloys. The solubilities of molybdenum and tungsten, especially in the presence of other elements such as chromium, are limited. However, molybdenum levels of 15 to 20 wt.% are possible in chromium-bearing nickel alloys, and molybdenum contents of 30 wt.% are feasible in nickel alloys with only minor additions of other elements.
As already mentioned, the reason for adding iron is usually economic, either to allow the use of less expensive charge materials during melting, or to produce materials which bridge the cost/performance gap between the corrosion-resistant nickel-based alloys and the austenitic and duplex (austenitic/ferritic) stainless steels. One of the problems of adding iron is that it reduces the solubilities (in the nickel-rich, gamma solid solution) of more beneficial elements, such as chromium and molybdenum, thus restricting the use of these elements, or causing the presence of second phases deleterious to ductility and/or corrosion resistance.
Other elements sometimes added to the wrought, corrosion-resistant nickel alloys (albeit in small amounts) include:
- Aluminum, either for oxygen control during melting, or (at slightly higher levels) to induce the precipitation of fine “gamma-prime” particles in the microstructure, for strengthening purposes. While gamma-prime (as a second phase) reduces corrosion resistance to some extent, gamma-prime strengthened versions of various nickel alloys have been commercially successful.
- Manganese, for sulfur control during the melting process.
- Titanium, to tie up any residual carbon (and/or nitrogen) in the form of stable carbides and/or carbo-nitrides, or to participate in the formation of fine, strengthening, gamma-prime precipitates.
- Niobium (columbium), which can also tie up residual carbon, and (at slightly higher levels) give rise to an alternate, fine, strengthening precipitate known as “gamma-double prime”.
Carbon and silicon are undesirable residuals in most wrought, corrosion-resistant, nickel-based alloys, and sophisticated techniques (such as argon-oxygen decarburization, or AOD) are employed during melting to minimize the contents of these two elements. They are undesirable because they are not very soluble in nickel, and can give rise to deleterious precipitates, particularly at the grain boundaries, both during hot working and welding (of annealed material).
A key step in the production and fabrication of these alloys is solution annealing, followed by rapid cooling (quenching). This enables the dissolution of unwanted second-phase precipitates (due to over-alloying, or the presence of residuals such as carbon and silicon) that might have occurred during hot (or warm) working, and the “locking-in” of this largely single-phase structure. Metastability is generally only a problem in the heat-affected zones of subsequent welds, where grain boundary precipitation can give rise to preferential attack of the boundaries in certain corrosive media. Structural changes during service are of little concern, since the intended use temperatures of these materials are below those required to cause significant diffusion.
Cobalt Alloy Background
Despite their close proximity in the Periodic Table, there are substantial differences between the atomic structures and characteristics of nickel and cobalt. Like nickel, cobalt is inherently resistant to corrosion, and can accommodate high levels of beneficial elements. However, cobalt exhibits two atomic forms:
- A low temperature hexagonal close packed (HCP) form.
- A high temperature face-centered cubic (FCC) form.
The transformation temperature of pure cobalt is 417°C. Alloying elements such as nickel, iron, and carbon (within its limited solubility range) are known as FCC stabilizers, and lower the transformation temperature. Chromium, molybdenum, and tungsten, on the other hand, are HCP stabilizers and raise the transformation temperature.
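The qualitative groupings above can be encoded in a short sketch. The element sets and the direction of each effect are taken directly from the text; the function itself is illustrative and says nothing about the magnitude of any shift.

```python
# Illustrative sketch: which alloying elements stabilize the FCC form of
# cobalt (lowering the HCP<->FCC transformation temperature, 417 C for
# pure cobalt) and which stabilize HCP (raising it), per the groupings
# given in the text. Purely qualitative -- no magnitudes are implied.

FCC_STABILIZERS = {"Ni", "Fe", "C"}   # lower the transformation temperature
HCP_STABILIZERS = {"Cr", "Mo", "W"}   # raise the transformation temperature

def transformation_shift(element: str) -> str:
    """Describe an element's qualitative effect on cobalt's
    FCC<->HCP transformation temperature."""
    if element in FCC_STABILIZERS:
        return "lowers transformation temperature (FCC stabilizer)"
    if element in HCP_STABILIZERS:
        return "raises transformation temperature (HCP stabilizer)"
    return "effect not covered here"

print(transformation_shift("Ni"))  # lowers ... (FCC stabilizer)
print(transformation_shift("W"))   # raises ... (HCP stabilizer)
```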
In reality, the transformation is extremely sluggish, and not easily brought about by either heating or cooling. Indeed, after solidification from the molten state (or after solution annealing and quenching, in the case of wrought products), cobalt and cobalt alloys (with elevated transformation temperatures) normally exhibit metastable FCC structures at room temperature. However, partial transformation to HCP is easily induced by cold-work (i.e. plastic deformation at room temperature).
The transformation of cobalt and cobalt alloys under the action of mechanical stresses is believed to progress by the creation of wide stacking faults (the FCC form of the materials having very low stacking fault energies) and by subsequent coalescence. Extensive micro-twinning is also observed in plastically-deformed, cobalt-based alloys.
Chromium provides the same benefits to cobalt as it does to nickel, i.e. it is key to the formation of protective films/scales in both corrosive fluids and high-temperature gases. Moreover, it influences the driving force for structural change in cobalt and its alloys, which in turn affects their mechanical and wear behavior.
The primary role of nickel (if present) in the cobalt-based alloys is to stabilize the FCC form. This negatively impacts wear performance, but provides many benefits, especially ease of wrought processing (at sufficiently high nickel contents).
Molybdenum and tungsten are both strong, solid-solution strengthening agents in cobalt-based alloys. They also result in higher transformation temperatures, which increase resistance to those forms of wear that involve a micro-fatigue component (such as metal-to-metal sliding and cavitation erosion). Molybdenum is used in those cobalt alloys developed primarily for resistance to aqueous corrosion and wear. Tungsten is used in those wrought cobalt alloys developed for high temperature use and those cast (and weld overlay), high-carbon alloys developed primarily for wear resistance in hostile environments.
In the cast (and weld overlay) cobalt alloys with relatively high carbon contents (i.e. from 0.5 to 3.5 wt.%), chromium, molybdenum, and tungsten also encourage the formation of carbides within the microstructure. These carbides (chromium-rich M7C3 and M23C6, and molybdenum/tungsten-rich M6C) are very beneficial under low stress (two-body) abrasion conditions.
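The carbide families just named can be summarized in a small lookup sketch. The dictionary simply restates the text's groupings; its structure and names are mine.

```python
# Illustrative lookup (restating the text): the carbide types found in
# the high-carbon (0.5-3.5 wt.% C) cast and weld-overlay cobalt alloys,
# mapped to the elements in which each carbide is rich.

CARBIDE_RICH_ELEMENTS = {
    "M7C3":  ["Cr"],       # chromium-rich
    "M23C6": ["Cr"],       # chromium-rich
    "M6C":   ["Mo", "W"],  # molybdenum/tungsten-rich
}

for carbide, elements in CARBIDE_RICH_ELEMENTS.items():
    print(f"{carbide}: rich in {', '.join(elements)}")
```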
As with the nickel-based alloys, iron can be used to reduce cost, particularly if it allows the use of ferro-compounds or iron-contaminated scraps in the charge materials during melting. However, it can also be used as an alternate FCC stabilizer (rather than nickel), to decrease the transformation temperature and make the alloys more amenable to wrought processing and fabrication.
The solubility of carbon in cobalt is higher than that in nickel; thus, there is less need to minimize carbon in wrought, corrosion- and wear-resistant, cobalt-based alloys. Furthermore, carbon is an important minor addition to the wrought, high temperature alloys (both of cobalt and nickel), and a major addition to those cast and weld overlay, cobalt alloys developed primarily for resistance to wear. Its purpose in the high temperature alloys is for strengthening, through the formation of sparsely dispersed carbides. Its purpose in the wear-resistant alloys is to generate high volume fractions of carbide in their microstructures, to increase their cutting and deformation resistance.