This is my first day of Swift programming; until now I have been using Objective-C. I tried to write a simple addition program, and it works:
var i = 10
var j = 10
var k = i + j
println(k)
But when I change one of the values to a floating-point literal, it gives an error:
var i = 10
var j = 10.4
var k = i + j
println(k)
Error: main.swift:13:11: Could not find an overload for '+' that
accepts the supplied arguments
I did a Google search and tried a few things, e.g. Double(i + j), but it doesn't work. Shouldn't Swift implicitly convert Int to Double in this case?
Please point out any mistake I am making in understanding the Swift language.
Depending on what you want your result to be, you should convert the values to the appropriate type using that type's initializer, e.g.:
var myInt = 5
var myDouble = 3.4
If I want a Double in my result, for example:
var doubleResult = Double(myInt) + myDouble
If I want an Int instead (note that the Double will be truncated):
var intResult = myInt + Int(myDouble)
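If truncation isn't what you want, you could round before converting. A minimal sketch, assuming Foundation is imported (it provides the C round function):
import Foundation
var rounded = Int(round(myDouble)) // round(3.4) is 3.0, so rounded == 3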
The problem I see in your example is that you're trying to perform the addition and then convert the result, but both values need to be the same type before you perform the addition.
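A quick sketch of the difference, using the values from your question: Double(i + j) doesn't compile because the compiler has to type-check i + j first, and there is no built-in + overload that accepts an Int and a Double together. Converting before adding works:
var i = 10
var j = 10.4
// var k = Double(i + j) // error: no overload of '+' for (Int, Double)
var k = Double(i) + j // fine: both operands are now Double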
Apple has made Swift quite strict to avoid type-mismatch and conversion errors. Sometimes this can feel a bit too strict for developers coming from other languages; I was annoyed at first, but I got used to it.
You could define your own operator...
// '+' already exists as an operator, so there is no need to
// redeclare it; just add overloads for the mixed-type cases.
// Put these at file level anywhere in your project.
func + (a: Int, b: Double) -> Double {
    return Double(a) + b
}

func + (a: Double, b: Int) -> Double {
    return a + Double(b)
}
let i = 10
let j = 10.4
let k = i + j // 20.4
...but I feel this goes against the spirit of the language (and, as @TheLazyChap says, it depends on what you want, which may not always be the same).
Try this:
var i = 10 // Int type
var j = 10.4 // Double type
var k = Double(i) + j // result is now a Double
println(k)
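This prints 20.4: converting i to Double before the addition makes both operands the same type, so the existing + overload for two Doubles applies.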