This is my first day of Swift programming; until now we have been using Objective-C. I tried to write a simple addition program and it works. Like,
var i = 10
var j = 10
var k = i + j
println(k)
But when I change one of the values to a float, it gives an error.
var i = 10
var j = 10.4
var k = i + j
println(k)
Error: main.swift:13:11: Could not find an overload for '+' that accepts the supplied arguments
Now I did a Google search and tried a few things, e.g. Double(i + j), but it doesn't work. Swift should implicitly convert Int to Double in this case, shouldn't it?
Please point out if I am making any mistake in understanding the Swift language.
Depending on what you want your result to be, you should convert it to the appropriate type using that type's init method.
For example, if I want a Double in my result:
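Something like this, reusing the i and j values from the question:

var i = 10
var j = 10.4
var k = Double(i) + j   // convert the Int before adding; k is a Double (20.4)
println(k)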
If I want an Int instead, note that the Double will be truncated:
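Again a sketch with the question's values:

var i = 10
var j = 10.4
var k = i + Int(j)   // Int(10.4) truncates to 10, so k is 20
println(k)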
The problem I see in your example is that you're trying to do the addition and then convert the result, but both values need to be of the same type before you perform the addition.
Apple has made it quite strict to avoid type-mismatch and conversion errors. Sometimes this can feel a bit 'too strict' for devs coming from other languages; I was annoyed at first, but I got used to it.
Try this:
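A sketch of the idea, converting the Int operand before the addition rather than converting the sum:

var i = 10
var j = 10.4
var k = Double(i) + j   // Double(i + j) does not compile, because i + j is evaluated first
println(k)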
You could define your own operator...
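For instance, a sketch of an overload that accepts an Int on the left and a Double on the right:

func + (lhs: Int, rhs: Double) -> Double {
    // Promote the Int to a Double so the mixed-type addition compiles
    return Double(lhs) + rhs
}

var i = 10
var j = 10.4
var k = i + j   // 20.4, via the overload above
println(k)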
...but I feel this is going against the spirit of the language (and, as @TheLazyChap says, it depends on what you want, which may not always be the same).