I am trying to apply a shading effect to text in an image using CIFilter.shadedMaterial() in SwiftUI. The idea is to take an input image containing text, generate a height field from it with CIFilter.heightFieldFromMask(), and then shade that height field using a texture from another image. However, I am facing two issues:
- Sometimes the output appears as a single solid color instead of being shaded properly.
- Other times the entire image turns into one uniform shade, losing all detail.
Code:
import SwiftUI
import CoreImage
import CoreImage.CIFilterBuiltins

struct ShadedMaterialDemo: View {
    let context = CIContext(options: nil) // ✅ Reuse a single CIContext
    @State var outputImage: UIImage?

    var body: some View {
        VStack {
            if let outputImage = outputImage {
                Image(uiImage: outputImage)
                    .resizable()
                    .scaledToFit()
            } else {
                Text("Processing Image...")
            }
        }
        .onAppear {
            Task {
                outputImage = await applyEdgeWorkFilter(intensity: 10)
            }
        }
    }

    // Builds a height field from the input image.
    // Note: heightFieldFromMask expects a mask (white shapes on black).
    func applyEdgeWorkFilter2(radius: Float) async -> CIImage? {
        guard let inputImage = UIImage(named: "imgBGText") else { return nil }
        guard let ciImage = CIImage(image: inputImage) else { return nil }
        let filter = CIFilter.heightFieldFromMask()
        filter.inputImage = ciImage
        filter.radius = radius
        return filter.outputImage
    }

    // Shades the height field with the texture from "imgShade".
    func applyEdgeWorkFilter(intensity: Float) async -> UIImage? {
        guard let edgeImage = await applyEdgeWorkFilter2(radius: 10) else { return nil }
        guard let shadeImage = UIImage(named: "imgShade"),
              let shadeCIImage = CIImage(image: shadeImage) else { return nil }
        let filter = CIFilter.shadedMaterial()
        filter.inputImage = edgeImage
        filter.shadingImage = shadeCIImage
        filter.scale = intensity
        guard let resultImage = filter.outputImage else { return nil }
        // ✅ Limit memory usage by cropping the output to a fixed rect
        //    (intersection crops; it does not downscale)
        let finalExtent = resultImage.extent.intersection(CGRect(x: 0, y: 0, width: 1024, height: 1024)) // Example size
        if let cgImage = context.createCGImage(resultImage, from: finalExtent) {
            return UIImage(cgImage: cgImage)
        }
        return nil
    }
}

struct ShadedMaterialDemo_Previews: PreviewProvider {
    static var previews: some View {
        ShadedMaterialDemo()
    }
}
What I Expect:
- The shading image should apply texture only to the edges of the text, not to the entire image.
- The output should retain details from the original image rather than turning into a single color.
Apple Doc - Shaded
Main Image
Shaded Image
Answer:
Your "Shaded" image needs to be the "shader" only...
This is what I get with your code and un-edited images:
If I clip out your "shader" image:
I get this:
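As I understand it, shadedMaterial treats the shading image like a small sphere/environment map: it looks up colors in it based on the slopes of the height field, so any background around the sphere gets sampled too, which would explain the flat result from the full image. Below is a minimal sketch of that clipping step, assuming a hypothetical layout where the shaded sphere occupies the top-left 256x256 pixels of "imgShade" (replace the rect with the real region in your asset):

    import CoreImage
    import CoreImage.CIFilterBuiltins
    import UIKit

    func makeShadingImage() -> CIImage? {
        guard let uiImage = UIImage(named: "imgShade"),
              let full = CIImage(image: uiImage) else { return nil }

        // Placeholder rect: CIImage coordinates have their origin at the
        // bottom-left, so the top-left 256x256 region starts at
        // y = height - 256. Adjust to wherever the sphere actually sits.
        let sphereRect = CGRect(x: 0,
                                y: full.extent.height - 256,
                                width: 256,
                                height: 256)

        // Crop to the sphere only, then translate the crop back to (0, 0)
        // so the shading image is a clean, origin-anchored texture.
        return full
            .cropped(to: sphereRect)
            .transformed(by: CGAffineTransform(translationX: -sphereRect.minX,
                                               y: -sphereRect.minY))
    }

Feeding the result of makeShadingImage() into filter.shadingImage in place of the full CIImage should reproduce the clipped-shader result shown above.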